Dec 01 10:30:59 crc systemd[1]: Starting Kubernetes Kubelet... Dec 01 10:30:59 crc restorecon[4718]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c225,c458 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 
10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c476,c820 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc 
restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc 
restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 
10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc 
restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc 
restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:30:59 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00
crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 
10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc 
restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc 
restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc 
restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:00 crc restorecon[4718]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 
crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc 
restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc 
restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc 
restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc 
restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc 
restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:00 crc restorecon[4718]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 01 10:31:00 crc kubenswrapper[4761]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 10:31:00 crc kubenswrapper[4761]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 01 10:31:00 crc kubenswrapper[4761]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 10:31:00 crc kubenswrapper[4761]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 01 10:31:00 crc kubenswrapper[4761]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 01 10:31:00 crc kubenswrapper[4761]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.936754 4761 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941305 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941325 4761 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941333 4761 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941338 4761 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941349 4761 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941354 4761 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941360 4761 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941366 4761 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941371 4761 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 
10:31:00.941376 4761 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941383 4761 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941389 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941394 4761 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941400 4761 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941405 4761 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941410 4761 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941415 4761 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941420 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941425 4761 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941430 4761 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941435 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941441 4761 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941446 4761 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 10:31:00 crc 
kubenswrapper[4761]: W1201 10:31:00.941451 4761 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941456 4761 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941462 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941467 4761 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941474 4761 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941481 4761 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941486 4761 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941492 4761 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941498 4761 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941503 4761 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941509 4761 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941516 4761 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941522 4761 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941528 4761 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941535 4761 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941540 4761 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941564 4761 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941569 4761 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941584 4761 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941589 4761 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941595 4761 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941601 4761 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941606 4761 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941611 4761 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941616 4761 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941621 4761 feature_gate.go:330] 
unrecognized feature gate: GatewayAPI Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941626 4761 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941631 4761 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941637 4761 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941642 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941647 4761 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941652 4761 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941657 4761 feature_gate.go:330] unrecognized feature gate: Example Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941662 4761 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941667 4761 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941672 4761 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941678 4761 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941683 4761 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941688 4761 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941693 4761 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941700 4761 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941706 4761 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941712 4761 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941723 4761 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941731 4761 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941744 4761 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941754 4761 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.941762 4761 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942087 4761 flags.go:64] FLAG: --address="0.0.0.0" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942104 4761 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942118 4761 flags.go:64] FLAG: --anonymous-auth="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942126 4761 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942136 4761 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942142 4761 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942151 4761 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942159 4761 flags.go:64] FLAG: 
--authorization-webhook-cache-authorized-ttl="5m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942165 4761 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942172 4761 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942178 4761 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942185 4761 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942191 4761 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942198 4761 flags.go:64] FLAG: --cgroup-root="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942204 4761 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942210 4761 flags.go:64] FLAG: --client-ca-file="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942216 4761 flags.go:64] FLAG: --cloud-config="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942222 4761 flags.go:64] FLAG: --cloud-provider="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942228 4761 flags.go:64] FLAG: --cluster-dns="[]" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942237 4761 flags.go:64] FLAG: --cluster-domain="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942244 4761 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942250 4761 flags.go:64] FLAG: --config-dir="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942256 4761 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942264 4761 flags.go:64] FLAG: --container-log-max-files="5" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 
10:31:00.942274 4761 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942280 4761 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942287 4761 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942293 4761 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942300 4761 flags.go:64] FLAG: --contention-profiling="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942306 4761 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942312 4761 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942319 4761 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942326 4761 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942334 4761 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942341 4761 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942347 4761 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942355 4761 flags.go:64] FLAG: --enable-load-reader="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942361 4761 flags.go:64] FLAG: --enable-server="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942367 4761 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942376 4761 flags.go:64] FLAG: --event-burst="100" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942382 4761 flags.go:64] FLAG: --event-qps="50" Dec 01 10:31:00 
crc kubenswrapper[4761]: I1201 10:31:00.942388 4761 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942395 4761 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942401 4761 flags.go:64] FLAG: --eviction-hard="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942409 4761 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942415 4761 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942421 4761 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942428 4761 flags.go:64] FLAG: --eviction-soft="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942434 4761 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942440 4761 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942446 4761 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942452 4761 flags.go:64] FLAG: --experimental-mounter-path="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942459 4761 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942465 4761 flags.go:64] FLAG: --fail-swap-on="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942471 4761 flags.go:64] FLAG: --feature-gates="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942478 4761 flags.go:64] FLAG: --file-check-frequency="20s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942485 4761 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942492 4761 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942498 4761 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942504 4761 flags.go:64] FLAG: --healthz-port="10248" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942510 4761 flags.go:64] FLAG: --help="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942516 4761 flags.go:64] FLAG: --hostname-override="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942523 4761 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942529 4761 flags.go:64] FLAG: --http-check-frequency="20s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942536 4761 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942542 4761 flags.go:64] FLAG: --image-credential-provider-config="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942565 4761 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942571 4761 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942579 4761 flags.go:64] FLAG: --image-service-endpoint="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942585 4761 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942591 4761 flags.go:64] FLAG: --kube-api-burst="100" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942597 4761 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942604 4761 flags.go:64] FLAG: --kube-api-qps="50" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942610 4761 flags.go:64] FLAG: --kube-reserved="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942616 4761 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942622 4761 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942629 4761 flags.go:64] FLAG: --kubelet-cgroups="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942635 4761 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942641 4761 flags.go:64] FLAG: --lock-file="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942647 4761 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942653 4761 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942659 4761 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942668 4761 flags.go:64] FLAG: --log-json-split-stream="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942675 4761 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942681 4761 flags.go:64] FLAG: --log-text-split-stream="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942687 4761 flags.go:64] FLAG: --logging-format="text" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942693 4761 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942700 4761 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942707 4761 flags.go:64] FLAG: --manifest-url="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942712 4761 flags.go:64] FLAG: --manifest-url-header="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942721 4761 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942727 4761 
flags.go:64] FLAG: --max-open-files="1000000" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942735 4761 flags.go:64] FLAG: --max-pods="110" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942741 4761 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942747 4761 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942753 4761 flags.go:64] FLAG: --memory-manager-policy="None" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942760 4761 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942766 4761 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942772 4761 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942778 4761 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942795 4761 flags.go:64] FLAG: --node-status-max-images="50" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942801 4761 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942808 4761 flags.go:64] FLAG: --oom-score-adj="-999" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942814 4761 flags.go:64] FLAG: --pod-cidr="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942820 4761 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942830 4761 flags.go:64] FLAG: --pod-manifest-path="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942836 4761 flags.go:64] FLAG: --pod-max-pids="-1" Dec 01 
10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942842 4761 flags.go:64] FLAG: --pods-per-core="0" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942848 4761 flags.go:64] FLAG: --port="10250" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942854 4761 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942860 4761 flags.go:64] FLAG: --provider-id="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942866 4761 flags.go:64] FLAG: --qos-reserved="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942872 4761 flags.go:64] FLAG: --read-only-port="10255" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942878 4761 flags.go:64] FLAG: --register-node="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942884 4761 flags.go:64] FLAG: --register-schedulable="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942890 4761 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942912 4761 flags.go:64] FLAG: --registry-burst="10" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942918 4761 flags.go:64] FLAG: --registry-qps="5" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942924 4761 flags.go:64] FLAG: --reserved-cpus="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942930 4761 flags.go:64] FLAG: --reserved-memory="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942939 4761 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942945 4761 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942952 4761 flags.go:64] FLAG: --rotate-certificates="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942958 4761 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942964 
4761 flags.go:64] FLAG: --runonce="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942970 4761 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942977 4761 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942983 4761 flags.go:64] FLAG: --seccomp-default="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942989 4761 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.942996 4761 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943002 4761 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943009 4761 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943015 4761 flags.go:64] FLAG: --storage-driver-password="root" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943021 4761 flags.go:64] FLAG: --storage-driver-secure="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943028 4761 flags.go:64] FLAG: --storage-driver-table="stats" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943034 4761 flags.go:64] FLAG: --storage-driver-user="root" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943040 4761 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943046 4761 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943053 4761 flags.go:64] FLAG: --system-cgroups="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943059 4761 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943070 4761 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 01 
10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943076 4761 flags.go:64] FLAG: --tls-cert-file="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943082 4761 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943090 4761 flags.go:64] FLAG: --tls-min-version="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943096 4761 flags.go:64] FLAG: --tls-private-key-file="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943103 4761 flags.go:64] FLAG: --topology-manager-policy="none" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943110 4761 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943116 4761 flags.go:64] FLAG: --topology-manager-scope="container" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943122 4761 flags.go:64] FLAG: --v="2" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943130 4761 flags.go:64] FLAG: --version="false" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943139 4761 flags.go:64] FLAG: --vmodule="" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943146 4761 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943153 4761 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943353 4761 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943360 4761 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943366 4761 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943374 4761 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943381 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943387 4761 feature_gate.go:330] unrecognized feature gate: Example Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943393 4761 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943399 4761 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943404 4761 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943409 4761 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943414 4761 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943420 4761 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943426 4761 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943431 4761 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943436 4761 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943441 4761 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943446 4761 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943451 4761 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943456 4761 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943461 4761 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943466 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943471 4761 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943476 4761 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943482 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943488 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943493 4761 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943498 4761 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943503 4761 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943508 4761 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943513 4761 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943519 4761 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943524 4761 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943529 4761 feature_gate.go:330] unrecognized 
feature gate: ConsolePluginContentSecurityPolicy Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943533 4761 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943539 4761 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943560 4761 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943566 4761 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943571 4761 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943576 4761 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943581 4761 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943586 4761 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943591 4761 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943597 4761 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943602 4761 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943610 4761 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943615 4761 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943622 4761 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943629 4761 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943635 4761 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943641 4761 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943646 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943652 4761 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943657 4761 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943663 4761 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943670 4761 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943678 4761 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943683 4761 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943688 4761 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943694 4761 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943700 4761 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943706 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943711 4761 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943718 4761 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943724 4761 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943729 4761 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943734 4761 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943741 4761 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943749 4761 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943757 4761 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943764 4761 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.943771 4761 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.943790 4761 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.956046 4761 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.956116 4761 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956347 4761 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956372 4761 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956383 4761 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956393 4761 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956404 4761 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956414 4761 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956423 4761 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956432 4761 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956442 4761 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956451 4761 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956460 4761 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956489 4761 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956500 4761 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956510 4761 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956521 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956530 4761 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956539 4761 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956588 4761 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956599 4761 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956609 4761 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956618 4761 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956628 4761 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956639 4761 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956648 4761 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956657 4761 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956667 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956676 4761 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956686 4761 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956696 4761 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956706 4761 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956716 4761 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956727 4761 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956737 4761 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956748 4761 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956780 4761 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956798 4761 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956810 4761 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956821 4761 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956835 4761 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956847 4761 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956858 4761 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956869 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956879 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956890 4761 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956902 4761 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956914 4761 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956924 4761 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956935 4761 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956945 4761 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956955 4761 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956968 4761 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956980 4761 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.956992 4761 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957003 4761 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957014 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957023 4761 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957034 4761 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957044 4761 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957054 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957064 4761 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957078 4761 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957090 4761 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957100 4761 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957112 4761 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957123 4761 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957133 4761 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957143 4761 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957153 4761 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957163 4761 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957173 4761 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957201 4761 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.957219 4761 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957670 4761 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957696 4761 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957710 4761 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957722 4761 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957733 4761 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957743 4761 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957754 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957765 4761 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957777 4761 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957788 4761 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957798 4761 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957808 4761 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957852 4761 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957863 4761 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957873 4761 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957883 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957893 4761 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957903 4761 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957913 4761 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957926 4761 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957938 4761 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957949 4761 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957961 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957971 4761 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957981 4761 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.957992 4761 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958001 4761 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958012 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958022 4761 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958032 4761 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958042 4761 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958052 4761 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958062 4761 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958073 4761 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958109 4761 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958120 4761 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958130 4761 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958140 4761 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958150 4761 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958161 4761 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958172 4761 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958186 4761 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958199 4761 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958210 4761 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958221 4761 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958231 4761 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958240 4761 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958250 4761 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958264 4761 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958278 4761 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958289 4761 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958299 4761 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958309 4761 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958320 4761 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958331 4761 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958342 4761 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958352 4761 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958362 4761 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958372 4761 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958383 4761 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958393 4761 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958402 4761 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958412 4761 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958422 4761 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958433 4761 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958442 4761 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958452 4761 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958462 4761 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958471 4761 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958482 4761 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 10:31:00 crc kubenswrapper[4761]: W1201 10:31:00.958511 4761 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.958528 4761 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.958996 4761 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.963783 4761 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.963889 4761 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.964610 4761 server.go:997] "Starting client certificate rotation"
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.964633 4761 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.965048 4761 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-04 19:15:46.870451383 +0000 UTC
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.965169 4761 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 824h44m45.905287257s for next certificate rotation
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.977579 4761 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.979990 4761 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 01 10:31:00 crc kubenswrapper[4761]: I1201 10:31:00.988628 4761 log.go:25] "Validated CRI v1 runtime API"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.014202 4761 log.go:25] "Validated CRI v1 image API"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.016304 4761 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.020047 4761 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-10-26-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.020094 4761 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} 
/dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.053031 4761 manager.go:217] Machine: {Timestamp:2025-12-01 10:31:01.0492385 +0000 UTC m=+0.352997194 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ec505933-0668-4f39-8d86-8e4b6f0f3c38 BootID:e43c0780-f8b7-40cc-82a5-0e835247b9ef Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:93:e8:74 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 
Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:93:e8:74 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:8f:35:38 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d9:c5:74 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2e:4f:39 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:14:eb:43 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:de:5a:b4:c8:b2:6b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0e:b4:02:71:f2:ff Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.053448 4761 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.053704 4761 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.054197 4761 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.054520 4761 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.054658 4761 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.055209 4761 topology_manager.go:138] "Creating topology manager with none policy"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.055232 4761 container_manager_linux.go:303] "Creating device plugin manager"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.055587 4761 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.055640 4761 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.056071 4761 state_mem.go:36] "Initialized new in-memory state store"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.056314 4761 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.059475 4761 kubelet.go:418] "Attempting to sync node with API server"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.059527 4761 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.059595 4761 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.059627 4761 kubelet.go:324] "Adding apiserver pod source"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.059654 4761 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.062533 4761 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.063277 4761 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 01 10:31:01 crc kubenswrapper[4761]: W1201 10:31:01.064090 4761 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused
Dec 01 10:31:01 crc kubenswrapper[4761]: W1201 10:31:01.064188 4761 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused
Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.064462 4761 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.88:6443: connect: connection refused" logger="UnhandledError"
Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.064412 4761 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.88:6443: connect: connection refused" logger="UnhandledError"
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.068428 4761 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 01 10:31:01 crc 
kubenswrapper[4761]: I1201 10:31:01.069304 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069349 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069373 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069388 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069410 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069424 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069438 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069461 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069478 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069493 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069512 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.069526 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.074044 4761 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.074870 4761 server.go:1280] "Started kubelet" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 
10:31:01.075393 4761 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.075576 4761 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.075532 4761 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 01 10:31:01 crc systemd[1]: Started Kubernetes Kubelet. Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.076726 4761 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.077914 4761 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.078014 4761 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.078183 4761 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.078212 4761 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.078383 4761 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.078602 4761 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.078995 4761 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 06:50:38.816578308 +0000 UTC Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.079055 4761 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 212h19m37.737525845s for next certificate rotation Dec 01 10:31:01 crc kubenswrapper[4761]: W1201 10:31:01.079430 4761 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.079492 4761 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.88:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.079659 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="200ms" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.080044 4761 factory.go:55] Registering systemd factory Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.080166 4761 factory.go:221] Registration of the systemd container factory successfully Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.080603 4761 server.go:460] "Adding debug handlers to kubelet server" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.080663 4761 factory.go:153] Registering CRI-O factory Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.080681 4761 factory.go:221] Registration of the crio container factory successfully Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.084066 4761 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial 
containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.086174 4761 factory.go:103] Registering Raw factory Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.086378 4761 manager.go:1196] Started watching for new ooms in manager Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.080491 4761 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.88:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d10c38d531cf9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:31:01.074803961 +0000 UTC m=+0.378562615,LastTimestamp:2025-12-01 10:31:01.074803961 +0000 UTC m=+0.378562615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.091428 4761 manager.go:319] Starting recovery of all containers Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096052 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096118 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 10:31:01 crc 
kubenswrapper[4761]: I1201 10:31:01.096138 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096156 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096174 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096192 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096216 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096233 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096254 4761 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096271 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096287 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096305 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096322 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096341 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096359 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096380 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096398 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096415 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096433 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096450 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096466 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096484 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096502 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096579 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096604 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096621 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096664 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096684 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096702 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096717 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096734 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096751 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096769 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" 
seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096785 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096802 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096818 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096834 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096850 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096866 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 
10:31:01.096882 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096899 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096914 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096932 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096946 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096963 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096982 4761 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.096999 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097016 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097031 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097047 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097065 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097088 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097110 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097128 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097145 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097163 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097180 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097197 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097242 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097260 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097275 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097291 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097308 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097325 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097341 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097355 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097371 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097388 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097403 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097419 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097484 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097504 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097522 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097537 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097576 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097623 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" 
seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097664 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097682 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097721 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097739 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097756 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.097773 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099541 4761 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099585 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099603 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099626 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099643 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099660 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099677 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099702 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099719 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099738 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099755 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099774 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099790 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099809 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099826 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099843 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099861 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099878 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099895 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099913 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099932 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099949 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099977 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.099998 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.100018 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.100041 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.100095 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.100142 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.102447 4761 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.102822 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.102942 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.103293 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.104488 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.104634 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105621 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105726 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105757 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105775 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105789 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105802 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105814 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105824 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105835 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105847 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105858 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105875 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105890 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105903 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105916 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105928 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105939 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105951 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105962 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105974 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105985 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.105996 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106006 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106019 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106029 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106039 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106050 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106072 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106091 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106111 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106126 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106137 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106148 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106158 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106169 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106180 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106191 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106201 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106214 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106224 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106233 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106242 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106252 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106263 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106273 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" 
seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106284 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106293 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106304 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106320 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106331 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106340 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106350 4761 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106360 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106369 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106380 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106392 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106402 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106412 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106430 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106440 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106449 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106459 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106468 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106477 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106487 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106496 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106505 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106516 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106539 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106594 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106606 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106618 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106633 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106646 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106657 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106671 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" 
seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106684 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106694 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106705 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106715 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106725 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106736 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 
10:31:01.106747 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106757 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106769 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106814 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106829 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106845 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106857 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106868 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106879 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106891 4761 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106900 4761 reconstruct.go:97] "Volume reconstruction finished" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.106908 4761 reconciler.go:26] "Reconciler: start to sync state" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.114268 4761 manager.go:324] Recovery completed Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.122869 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.124041 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.124078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 
01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.124087 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.124881 4761 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.124901 4761 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.124917 4761 state_mem.go:36] "Initialized new in-memory state store" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.125422 4761 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.127080 4761 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.127123 4761 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.127152 4761 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.127197 4761 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.179499 4761 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 10:31:01 crc kubenswrapper[4761]: W1201 10:31:01.212986 4761 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.213097 4761 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed 
to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.88:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.227654 4761 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.245232 4761 policy_none.go:49] "None policy: Start" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.246488 4761 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.246523 4761 state_mem.go:35] "Initializing new in-memory state store" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.279666 4761 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.280809 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="400ms" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.319959 4761 manager.go:334] "Starting Device Plugin manager" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.320016 4761 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.320031 4761 server.go:79] "Starting device plugin registration server" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.322425 4761 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.322446 4761 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" 
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.322627 4761 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.322769 4761 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.322779 4761 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.331154 4761 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.423615 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.425154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.425205 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.425222 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.425255 4761 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.425994 4761 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.88:6443: connect: connection refused" node="crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.428203 4761 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.428295 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.429920 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.429988 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.430013 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.430274 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.430475 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.430513 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.431672 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.431701 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.431712 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.431678 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.431740 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.431754 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.431859 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.432114 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.432173 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.432696 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.432762 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.432787 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.432981 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.433092 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.433144 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.434094 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.434125 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.434139 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.434202 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.434239 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.434262 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.435417 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.435489 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.435513 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.435816 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: 
I1201 10:31:01.436019 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.436091 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.437188 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.437220 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.437246 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.437346 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.437429 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.437454 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.437465 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.437497 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.438294 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.438318 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.438330 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.510710 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.510769 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.510807 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.510872 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.510922 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511003 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511061 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511093 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511128 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511156 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511231 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511310 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511359 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511389 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.511418 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613273 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613360 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613391 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613418 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613449 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613477 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613505 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613534 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613620 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613591 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613633 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613650 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613648 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613680 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613686 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613711 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613733 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613734 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613744 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.613789 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614174 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614235 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614308 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614356 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614398 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614461 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614473 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614510 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614521 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.614597 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.626450 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.627848 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.627935 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.627949 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.627977 4761 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.628585 4761 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.88:6443: connect: connection refused" node="crc" Dec 01 10:31:01 crc kubenswrapper[4761]: E1201 10:31:01.682287 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="800ms" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.753222 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.759770 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: W1201 10:31:01.779493 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-3aa3251390585cef011c0f14db8d0589a1d52e56a930eb91b898b4d552a6208a WatchSource:0}: Error finding container 3aa3251390585cef011c0f14db8d0589a1d52e56a930eb91b898b4d552a6208a: Status 404 returned error can't find the container with id 3aa3251390585cef011c0f14db8d0589a1d52e56a930eb91b898b4d552a6208a Dec 01 10:31:01 crc kubenswrapper[4761]: W1201 10:31:01.784103 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-76960eb03c6e6d7e3a829e72cff8d76d1e25febc08373f7762b38a0c83a84747 WatchSource:0}: Error finding container 76960eb03c6e6d7e3a829e72cff8d76d1e25febc08373f7762b38a0c83a84747: Status 404 returned error can't find the container with id 76960eb03c6e6d7e3a829e72cff8d76d1e25febc08373f7762b38a0c83a84747 Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.793829 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.808095 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: I1201 10:31:01.815202 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:01 crc kubenswrapper[4761]: W1201 10:31:01.832178 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-186dc4ae1709a4b37efddcc4e40a137c6f2997fd249c878ed752fdd006378a5b WatchSource:0}: Error finding container 186dc4ae1709a4b37efddcc4e40a137c6f2997fd249c878ed752fdd006378a5b: Status 404 returned error can't find the container with id 186dc4ae1709a4b37efddcc4e40a137c6f2997fd249c878ed752fdd006378a5b Dec 01 10:31:01 crc kubenswrapper[4761]: W1201 10:31:01.833108 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-0ee1b58ec7d6402f9eba40755c99891910aff41466f8af860fae12395547e25d WatchSource:0}: Error finding container 0ee1b58ec7d6402f9eba40755c99891910aff41466f8af860fae12395547e25d: Status 404 returned error can't find the container with id 0ee1b58ec7d6402f9eba40755c99891910aff41466f8af860fae12395547e25d Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.029713 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.030625 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.030652 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.030661 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.030685 4761 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:02 crc 
kubenswrapper[4761]: E1201 10:31:02.031058 4761 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.88:6443: connect: connection refused" node="crc" Dec 01 10:31:02 crc kubenswrapper[4761]: W1201 10:31:02.065105 4761 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:02 crc kubenswrapper[4761]: E1201 10:31:02.065184 4761 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.88:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.077036 4761 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.130380 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1e98443930e53bb099eb34a13188d050c09e614822ec78dbd0c2d57c74394fd0"} Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.131575 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"186dc4ae1709a4b37efddcc4e40a137c6f2997fd249c878ed752fdd006378a5b"} Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.132414 4761 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0ee1b58ec7d6402f9eba40755c99891910aff41466f8af860fae12395547e25d"} Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.133326 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3aa3251390585cef011c0f14db8d0589a1d52e56a930eb91b898b4d552a6208a"} Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.134491 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"76960eb03c6e6d7e3a829e72cff8d76d1e25febc08373f7762b38a0c83a84747"} Dec 01 10:31:02 crc kubenswrapper[4761]: W1201 10:31:02.295434 4761 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:02 crc kubenswrapper[4761]: E1201 10:31:02.295888 4761 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.88:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:02 crc kubenswrapper[4761]: W1201 10:31:02.395642 4761 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:02 crc kubenswrapper[4761]: E1201 
10:31:02.395786 4761 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.88:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:02 crc kubenswrapper[4761]: W1201 10:31:02.465618 4761 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:02 crc kubenswrapper[4761]: E1201 10:31:02.465685 4761 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.88:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:02 crc kubenswrapper[4761]: E1201 10:31:02.483925 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="1.6s" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.831925 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.833836 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.833888 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.833905 
4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:02 crc kubenswrapper[4761]: I1201 10:31:02.833938 4761 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:02 crc kubenswrapper[4761]: E1201 10:31:02.834478 4761 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.88:6443: connect: connection refused" node="crc" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.076024 4761 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.88:6443: connect: connection refused Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.139442 4761 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83" exitCode=0 Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.139489 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83"} Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.139571 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.140593 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.140652 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.140673 4761 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.141490 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea" exitCode=0 Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.141611 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.141621 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea"} Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.142623 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.142655 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.142672 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.144099 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.144643 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0"} Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.144672 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2"} Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.144683 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d"} Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.144694 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398"} Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.144769 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.145028 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.145065 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.145082 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.145259 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.145288 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.145301 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.146421 4761 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a" exitCode=0 Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.146517 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.146666 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a"} Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.147777 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.147808 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.147825 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.149139 4761 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="07d5e1f94061c640712c48384a939b06bf428350b9556adf244309e9ab2e899d" exitCode=0 Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.149186 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"07d5e1f94061c640712c48384a939b06bf428350b9556adf244309e9ab2e899d"} Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.149211 4761 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.150488 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.150617 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:03 crc kubenswrapper[4761]: I1201 10:31:03.150641 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.153251 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7"} Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.153294 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326"} Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.153304 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a"} Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.153383 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.154663 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.154692 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.154704 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.173510 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655"} Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.173580 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d"} Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.173597 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69"} Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.175082 4761 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359" exitCode=0 Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.175214 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.175222 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359"} Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.176183 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.176242 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.176256 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.178398 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.178421 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.178484 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"98f5fedd93471d67b094b64485d810027122bd9557ab170623f04cffc87d2c19"} Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.179334 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.179353 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.179339 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.179363 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.179378 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:04 crc 
kubenswrapper[4761]: I1201 10:31:04.179461 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.434668 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.436470 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.436514 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.436526 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.436572 4761 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:04 crc kubenswrapper[4761]: I1201 10:31:04.997401 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.186733 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec"} Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.186820 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735"} Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.186761 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:05 crc 
kubenswrapper[4761]: I1201 10:31:05.188109 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.188144 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.188160 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.190000 4761 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c" exitCode=0 Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.190123 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.190171 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.190192 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c"} Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.190229 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.190333 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.190725 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.192832 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.192903 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.192926 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.192916 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.193124 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.193154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.193448 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.193694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.193867 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.196867 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.196928 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.196947 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:05 crc kubenswrapper[4761]: I1201 10:31:05.232485 4761 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.198714 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d"} Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.198771 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd"} Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.198784 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c"} Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.198795 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5"} Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.198811 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.198875 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.198872 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.200170 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 
10:31:06.200222 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.200234 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.200261 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.200317 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:06 crc kubenswrapper[4761]: I1201 10:31:06.200345 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.208355 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38"} Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.208645 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.210222 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.210323 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.210411 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.676083 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:07 crc 
kubenswrapper[4761]: I1201 10:31:07.676433 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.678167 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.678224 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.678243 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.998108 4761 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 10:31:07 crc kubenswrapper[4761]: I1201 10:31:07.998243 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.034635 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.034840 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.036286 4761 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.036315 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.036328 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.045419 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.211007 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.211151 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.216581 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.216640 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.216653 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.216751 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.216764 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.216776 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 
10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.463051 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.463294 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.464530 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.464612 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.464628 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:08 crc kubenswrapper[4761]: I1201 10:31:08.477731 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.214140 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.215272 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.215321 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.215338 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.228772 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.229002 4761 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.230395 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.230429 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:09 crc kubenswrapper[4761]: I1201 10:31:09.230443 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:10 crc kubenswrapper[4761]: I1201 10:31:10.041886 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:10 crc kubenswrapper[4761]: I1201 10:31:10.042071 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:31:10 crc kubenswrapper[4761]: I1201 10:31:10.042128 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:10 crc kubenswrapper[4761]: I1201 10:31:10.043544 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:10 crc kubenswrapper[4761]: I1201 10:31:10.043602 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:10 crc kubenswrapper[4761]: I1201 10:31:10.043616 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.129933 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.130110 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:11 crc kubenswrapper[4761]: 
I1201 10:31:11.131438 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.131518 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.131534 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:11 crc kubenswrapper[4761]: E1201 10:31:11.331292 4761 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.449079 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.449274 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.451227 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.451295 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:11 crc kubenswrapper[4761]: I1201 10:31:11.451315 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:14 crc kubenswrapper[4761]: I1201 10:31:14.078423 4761 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 10:31:14 crc kubenswrapper[4761]: E1201 10:31:14.085652 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 01 10:31:14 crc kubenswrapper[4761]: E1201 10:31:14.438067 4761 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 01 10:31:14 crc kubenswrapper[4761]: E1201 10:31:14.511753 4761 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187d10c38d531cf9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:31:01.074803961 +0000 UTC m=+0.378562615,LastTimestamp:2025-12-01 10:31:01.074803961 +0000 UTC m=+0.378562615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:31:14 crc kubenswrapper[4761]: I1201 10:31:14.568415 4761 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 10:31:14 crc kubenswrapper[4761]: I1201 10:31:14.568483 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 10:31:14 crc kubenswrapper[4761]: I1201 10:31:14.578274 4761 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 10:31:14 crc kubenswrapper[4761]: I1201 10:31:14.578362 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 10:31:17.638263 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 10:31:17.640159 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 10:31:17.640437 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 10:31:17.640503 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 10:31:17.640606 4761 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:17 crc kubenswrapper[4761]: E1201 10:31:17.646205 4761 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 
10:31:17.676915 4761 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 10:31:17.677024 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 10:31:17.998331 4761 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 10:31:17 crc kubenswrapper[4761]: I1201 10:31:17.998444 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:31:18 crc kubenswrapper[4761]: I1201 10:31:18.470830 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:18 crc kubenswrapper[4761]: I1201 10:31:18.471473 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:18 crc kubenswrapper[4761]: I1201 10:31:18.472007 4761 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 10:31:18 crc kubenswrapper[4761]: I1201 10:31:18.472114 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 10:31:18 crc kubenswrapper[4761]: I1201 10:31:18.473916 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:18 crc kubenswrapper[4761]: I1201 10:31:18.473951 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:18 crc kubenswrapper[4761]: I1201 10:31:18.473964 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:18 crc kubenswrapper[4761]: I1201 10:31:18.478165 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.240994 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.241490 4761 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.241617 4761 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.242665 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.242706 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.242717 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.576180 4761 trace.go:236] Trace[909881398]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 10:31:05.282) (total time: 14293ms): Dec 01 10:31:19 crc kubenswrapper[4761]: Trace[909881398]: ---"Objects listed" error: 14293ms (10:31:19.576) Dec 01 10:31:19 crc kubenswrapper[4761]: Trace[909881398]: [14.293274345s] [14.293274345s] END Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.576249 4761 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.576281 4761 trace.go:236] Trace[1820621528]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 10:31:05.300) (total time: 14275ms): Dec 01 10:31:19 crc kubenswrapper[4761]: Trace[1820621528]: ---"Objects listed" error: 14275ms (10:31:19.576) Dec 01 10:31:19 crc kubenswrapper[4761]: Trace[1820621528]: [14.275859802s] [14.275859802s] END Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.576324 4761 reflector.go:368] Caches populated for *v1.CSIDriver from 
k8s.io/client-go/informers/factory.go:160 Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.577209 4761 trace.go:236] Trace[2135531870]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 10:31:04.724) (total time: 14852ms): Dec 01 10:31:19 crc kubenswrapper[4761]: Trace[2135531870]: ---"Objects listed" error: 14852ms (10:31:19.577) Dec 01 10:31:19 crc kubenswrapper[4761]: Trace[2135531870]: [14.852978287s] [14.852978287s] END Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.577253 4761 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.578336 4761 trace.go:236] Trace[988211449]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 10:31:05.502) (total time: 14075ms): Dec 01 10:31:19 crc kubenswrapper[4761]: Trace[988211449]: ---"Objects listed" error: 14075ms (10:31:19.578) Dec 01 10:31:19 crc kubenswrapper[4761]: Trace[988211449]: [14.075432654s] [14.075432654s] END Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.578381 4761 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 10:31:19 crc kubenswrapper[4761]: I1201 10:31:19.579599 4761 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.073777 4761 apiserver.go:52] "Watching apiserver" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.077244 4761 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.077732 4761 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.078305 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.078408 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.078537 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.078845 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.078971 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.079056 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.079140 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.079338 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.079439 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082019 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082102 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082157 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082206 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082266 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082291 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082308 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082314 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082466 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082473 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082533 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082590 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082619 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082644 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.082670 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.083262 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.083387 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.083594 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.083690 4761 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.083709 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.083776 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.087283 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.087356 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.091064 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.093133 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.096078 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.096538 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.098082 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.098140 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.098170 4761 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 
01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.098295 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:20.598261217 +0000 UTC m=+19.902019971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.103247 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.103448 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.103635 4761 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.103842 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:31:20.603814156 +0000 UTC m=+19.907572910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.104449 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.115927 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.117082 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 
10:31:20.125715 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.134086 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.148278 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.150873 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.163500 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: W1201 10:31:20.167363 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-42018c49ef77525dbe8062f28582fbaa10053a7238167d989ac4444edefc3716 WatchSource:0}: Error finding container 42018c49ef77525dbe8062f28582fbaa10053a7238167d989ac4444edefc3716: Status 404 returned error can't find the container with id 42018c49ef77525dbe8062f28582fbaa10053a7238167d989ac4444edefc3716 Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.175986 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.178854 4761 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.184393 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.184656 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.184820 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.184996 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.185139 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 10:31:20 crc 
kubenswrapper[4761]: I1201 10:31:20.185290 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.185444 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.185640 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186111 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186688 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186724 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186915 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186931 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.184686 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186958 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.184862 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.185398 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.185678 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.185720 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186040 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186238 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186301 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186415 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.186973 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187048 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187069 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187085 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187102 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187118 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187133 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187149 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187163 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187178 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") 
pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187179 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187194 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187211 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187228 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187244 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187259 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187275 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187291 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187305 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187323 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187339 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187354 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187370 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187385 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187399 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187414 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 
10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187430 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187446 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187463 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187506 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187522 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187537 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187577 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187597 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187612 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187627 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187641 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 10:31:20 crc 
kubenswrapper[4761]: I1201 10:31:20.187656 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187671 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187688 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187719 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187736 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187752 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187767 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187782 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187797 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187810 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187825 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 
10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187839 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187858 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187873 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187889 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187905 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187920 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187934 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187948 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187962 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187979 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188010 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 
10:31:20.188026 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188041 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188057 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188071 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188090 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188110 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188132 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188152 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188176 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188196 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188215 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188236 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188253 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188269 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188288 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188306 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188323 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188340 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188369 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188386 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188400 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188415 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188436 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188452 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188468 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188483 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188498 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188514 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188529 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188571 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188597 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188619 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188685 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187208 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187472 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187500 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187693 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.187801 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188181 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188418 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188455 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188653 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188780 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.188937 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189077 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189134 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189283 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189285 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189290 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189521 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189772 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189769 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.189935 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190089 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190152 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190163 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190390 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190414 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190419 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190466 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190566 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190701 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190808 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190815 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190971 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.190991 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191065 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191200 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191297 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191308 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191307 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191383 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191411 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191478 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191506 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191543 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191634 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191655 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191676 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191695 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191713 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191670 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191734 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191754 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191773 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191812 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191831 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191864 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191943 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191967 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.191989 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192011 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192033 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192055 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192075 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192099 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192122 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192145 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192165 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192186 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192207 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192227 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192247 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192269 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192294 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192312 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192333 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192324 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192353 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192375 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192380 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192394 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192415 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192436 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192457 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192479 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192501 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192522 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192561 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192585 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192610 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192704 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192728 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192751 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192778 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192804 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192831 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192855 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") 
pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192879 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192905 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192591 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192960 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.192988 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193087 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193110 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193174 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193205 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193272 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193519 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193592 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193618 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193638 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193683 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193705 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193726 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193748 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193769 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193794 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193819 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193840 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193860 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193881 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193900 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193922 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193943 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193967 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193988 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194009 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194032 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194312 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194342 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194369 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194396 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194423 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194442 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194462 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194480 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194497 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194564 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194660 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194689 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194727 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194822 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194836 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194846 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194855 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194868 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194881 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194894 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 
10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194907 4761 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194919 4761 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194932 4761 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194945 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194957 4761 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194972 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194987 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194999 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195012 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195025 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195037 4761 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195050 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195062 4761 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195075 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195087 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195100 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195113 4761 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195126 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195137 4761 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195146 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195155 4761 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195165 4761 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195174 4761 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195183 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195192 4761 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195201 4761 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195210 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195219 4761 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195229 4761 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195237 4761 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195247 4761 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195256 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195266 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195275 4761 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195285 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195293 4761 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195277 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195302 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195462 4761 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195481 4761 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195494 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195505 4761 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195516 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195529 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195540 4761 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195708 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195724 4761 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195736 4761 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195747 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.201064 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.201572 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193170 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.193340 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.194243 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195397 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202865 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202898 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195391 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.195647 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.196032 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.196227 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.196200 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.196368 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202742 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.197115 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.197149 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.197714 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.197748 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.198007 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.198102 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.198180 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.198273 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.198351 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.198620 4761 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.203301 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:20.703281628 +0000 UTC m=+20.007040352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203353 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.198874 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.199210 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.199312 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.199399 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.199404 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.199743 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.199777 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.199860 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.199998 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200127 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200211 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200345 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200356 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200370 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200498 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200608 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203836 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200740 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200895 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200923 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200936 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200944 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.200905 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203927 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.201091 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.201124 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.201231 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.201332 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202061 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.202105 4761 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.204184 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:20.704155333 +0000 UTC m=+20.007914057 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202073 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202186 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.201403 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.204262 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202208 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202194 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202197 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202229 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202252 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202704 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202721 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202743 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.202725 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203380 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203418 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203482 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203504 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203511 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203523 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.198880 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203647 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.203827 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.204316 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.204569 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.204780 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.204822 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205129 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205096 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205480 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205481 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205456 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205643 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205721 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205719 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205789 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.205810 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206134 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206163 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206222 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206404 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206430 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206495 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206590 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206592 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206700 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206754 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206778 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206462 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.206945 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.207024 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.207310 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.207411 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.207456 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.207477 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.207514 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.207662 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.207969 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:20.706357176 +0000 UTC m=+20.010115800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.208082 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.208709 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.210034 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.211045 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.211905 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.212239 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.212283 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.212526 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.212590 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.213120 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.213218 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.213075 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.213497 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.213899 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.213969 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214151 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214285 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214349 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214402 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214673 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214686 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214767 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214913 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214837 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.214868 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.215696 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.216416 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.221715 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.235435 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.238766 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.239923 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.250243 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.256204 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec" exitCode=255 Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.256493 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.256308 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec"} Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.258607 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"42018c49ef77525dbe8062f28582fbaa10053a7238167d989ac4444edefc3716"} Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.268314 4761 scope.go:117] "RemoveContainer" containerID="6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.269464 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.273053 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.292865 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302676 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 
10:31:20.302699 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302708 4761 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302717 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302726 4761 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302734 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302742 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302750 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302758 4761 reconciler_common.go:293] "Volume detached for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302766 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302774 4761 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302782 4761 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302790 4761 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302798 4761 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302806 4761 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302813 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302821 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302828 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302836 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302845 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302852 4761 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302860 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302869 4761 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 
01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302877 4761 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302885 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302892 4761 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302902 4761 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302910 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302918 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302939 4761 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302947 4761 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302955 4761 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302963 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302971 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302978 4761 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302985 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.302993 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303000 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303008 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303015 4761 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303023 4761 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303034 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303042 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303049 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303067 4761 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303076 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303083 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303091 4761 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303099 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303106 4761 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303113 4761 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303121 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303128 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303148 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303155 4761 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303163 4761 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303171 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303179 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303187 4761 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303195 4761 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303203 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303211 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303219 4761 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303227 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303235 4761 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303242 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node 
\"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303252 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303260 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303269 4761 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303277 4761 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303285 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303293 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303300 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath 
\"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303325 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303332 4761 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303340 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303348 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303356 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303364 4761 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303371 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303379 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" 
(UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303387 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303395 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303403 4761 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303411 4761 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303418 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303426 4761 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303434 4761 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303441 4761 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303449 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303456 4761 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303465 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303472 4761 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303480 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303500 4761 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc 
kubenswrapper[4761]: I1201 10:31:20.303508 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303517 4761 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303524 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303532 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303539 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303573 4761 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303581 4761 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303588 4761 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303600 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303607 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303615 4761 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303622 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303630 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303638 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303645 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on 
node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303653 4761 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303661 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303672 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303679 4761 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303687 4761 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303695 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303703 4761 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 
10:31:20.303710 4761 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303718 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303726 4761 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303734 4761 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303741 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303749 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303757 4761 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303765 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303772 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303780 4761 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303788 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303796 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303804 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303812 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303820 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node 
\"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303828 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303839 4761 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303849 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303856 4761 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303866 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303873 4761 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.303880 4761 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.314395 4761 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.316982 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.325692 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.334863 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.345789 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.444065 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.459992 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:20 crc kubenswrapper[4761]: W1201 10:31:20.474999 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-4f75a85c39d60bbb554aaa96baf68da6231dbcff308eff47b07e4fc947366f7d WatchSource:0}: Error finding container 4f75a85c39d60bbb554aaa96baf68da6231dbcff308eff47b07e4fc947366f7d: Status 404 returned error can't find the container with id 4f75a85c39d60bbb554aaa96baf68da6231dbcff308eff47b07e4fc947366f7d Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.606083 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.606161 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.606363 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.606397 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: 
E1201 10:31:20.606414 4761 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.606478 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:21.606455556 +0000 UTC m=+20.910214190 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.606603 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.606626 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.606641 4761 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 
10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.606685 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:21.606670962 +0000 UTC m=+20.910429606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.707305 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.707388 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.707432 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.707562 4761 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.707621 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:21.707602546 +0000 UTC m=+21.011361180 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.707680 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:21.707673258 +0000 UTC m=+21.011431882 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.707718 4761 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: E1201 10:31:20.707763 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:21.70775549 +0000 UTC m=+21.011514114 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.818768 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jbqqz"] Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.819192 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jbqqz" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.821220 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.821302 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.821669 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.834745 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.848229 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.856833 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.871062 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.883971 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.895493 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.906956 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.908943 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8b958982-d6cc-45e7-b3f4-1684bfa145bc-hosts-file\") pod \"node-resolver-jbqqz\" (UID: \"8b958982-d6cc-45e7-b3f4-1684bfa145bc\") " pod="openshift-dns/node-resolver-jbqqz" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.908985 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxz47\" (UniqueName: \"kubernetes.io/projected/8b958982-d6cc-45e7-b3f4-1684bfa145bc-kube-api-access-wxz47\") pod \"node-resolver-jbqqz\" (UID: \"8b958982-d6cc-45e7-b3f4-1684bfa145bc\") " pod="openshift-dns/node-resolver-jbqqz" Dec 01 10:31:20 crc kubenswrapper[4761]: I1201 10:31:20.918631 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.009725 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8b958982-d6cc-45e7-b3f4-1684bfa145bc-hosts-file\") pod \"node-resolver-jbqqz\" (UID: \"8b958982-d6cc-45e7-b3f4-1684bfa145bc\") " pod="openshift-dns/node-resolver-jbqqz" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.009763 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxz47\" (UniqueName: \"kubernetes.io/projected/8b958982-d6cc-45e7-b3f4-1684bfa145bc-kube-api-access-wxz47\") pod \"node-resolver-jbqqz\" (UID: 
\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\") " pod="openshift-dns/node-resolver-jbqqz" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.009858 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8b958982-d6cc-45e7-b3f4-1684bfa145bc-hosts-file\") pod \"node-resolver-jbqqz\" (UID: \"8b958982-d6cc-45e7-b3f4-1684bfa145bc\") " pod="openshift-dns/node-resolver-jbqqz" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.030093 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxz47\" (UniqueName: \"kubernetes.io/projected/8b958982-d6cc-45e7-b3f4-1684bfa145bc-kube-api-access-wxz47\") pod \"node-resolver-jbqqz\" (UID: \"8b958982-d6cc-45e7-b3f4-1684bfa145bc\") " pod="openshift-dns/node-resolver-jbqqz" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.130727 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jbqqz" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.136615 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.137089 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.138251 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.138845 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.139875 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.140477 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.141111 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.142418 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.143066 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.144104 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.144631 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.145696 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.146181 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.146700 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.147578 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.148135 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.149563 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.149968 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.150509 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.151444 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.152039 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.153453 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.154102 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.154744 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: W1201 10:31:21.155391 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b958982_d6cc_45e7_b3f4_1684bfa145bc.slice/crio-f439acde5897f1ab3feaf840d08c3315f16ec8ca6a9d1df84c69a770798970f1 WatchSource:0}: Error finding container f439acde5897f1ab3feaf840d08c3315f16ec8ca6a9d1df84c69a770798970f1: Status 404 returned error can't find the container with id f439acde5897f1ab3feaf840d08c3315f16ec8ca6a9d1df84c69a770798970f1 Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.155537 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.156211 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.156449 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.157368 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.158140 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.159098 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.159579 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.160084 4761 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.160664 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.162278 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.162876 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.163745 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.165318 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.166028 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.167717 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.168475 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.170741 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.171262 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.171747 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.173232 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.173902 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.175254 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.175753 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.177143 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.182252 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.183383 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.184195 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" 
path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.184650 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.185105 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.185566 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.186098 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.187831 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.188316 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.188778 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.188814 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qjx5r"] Dec 01 10:31:21 crc 
kubenswrapper[4761]: I1201 10:31:21.189260 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.191864 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.192118 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.192229 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.192387 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.193941 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.210366 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.221427 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.239261 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.257713 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.263882 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jbqqz" event={"ID":"8b958982-d6cc-45e7-b3f4-1684bfa145bc","Type":"ContainerStarted","Data":"f439acde5897f1ab3feaf840d08c3315f16ec8ca6a9d1df84c69a770798970f1"} Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.264146 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.266057 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a"} Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.266093 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"37b01c167850f94980d61cacd6726bcd03281846db0798cba9cd9977ce96d39b"} Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.276666 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.281064 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592"} Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.281132 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.287308 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51"} Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.287506 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a"} Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.288331 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4f75a85c39d60bbb554aaa96baf68da6231dbcff308eff47b07e4fc947366f7d"} Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.294744 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.307188 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.311793 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-rootfs\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.311937 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-proxy-tls\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.312077 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvvs2\" (UniqueName: 
\"kubernetes.io/projected/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-kube-api-access-wvvs2\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.312193 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-mcd-auth-proxy-config\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.323042 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc 
kubenswrapper[4761]: I1201 10:31:21.340976 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.354845 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.371842 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.383131 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.394799 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.410366 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.412680 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-rootfs\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.412723 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-proxy-tls\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.412744 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvvs2\" (UniqueName: \"kubernetes.io/projected/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-kube-api-access-wvvs2\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.412792 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-mcd-auth-proxy-config\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.413166 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-rootfs\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.414008 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-mcd-auth-proxy-config\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.416949 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-proxy-tls\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.427229 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.429892 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvvs2\" (UniqueName: \"kubernetes.io/projected/eaf56ffe-a6c0-446a-81db-deae9bd72c7c-kube-api-access-wvvs2\") pod \"machine-config-daemon-qjx5r\" (UID: \"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\") " pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.439207 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.477838 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.501233 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.501632 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 10:31:21 crc kubenswrapper[4761]: W1201 10:31:21.509969 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf56ffe_a6c0_446a_81db_deae9bd72c7c.slice/crio-0ff7015a38d3b9d4b65227778ff0f8fd9862fb446a3cdda0f573c6cc721e634a WatchSource:0}: Error finding container 0ff7015a38d3b9d4b65227778ff0f8fd9862fb446a3cdda0f573c6cc721e634a: Status 404 returned error can't find the container with id 0ff7015a38d3b9d4b65227778ff0f8fd9862fb446a3cdda0f573c6cc721e634a Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.512446 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.533285 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.545957 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.559521 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.577911 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a7240
8debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.589753 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.593470 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-8sv24"] Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.594236 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nz6qt"] Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.594372 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.594966 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.596278 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.596352 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.597387 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pllhm"] Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.598430 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.598793 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.598989 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.599141 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.599243 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.599273 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.600927 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.601077 4761 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.601410 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.602908 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.602931 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.602937 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.603208 4761 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.603780 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.613437 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.613488 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.613624 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.613653 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.613599 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.613667 4761 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.613625 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.613706 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.613719 4761 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.613739 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:23.613722702 +0000 UTC m=+22.917481326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.613759 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:23.613748332 +0000 UTC m=+22.917506956 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.635058 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.654598 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.663013 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.677801 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.701050 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714261 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714357 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-systemd-units\") pod \"ovnkube-node-pllhm\" (UID: 
\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714384 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-node-log\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714405 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714430 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-systemd\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714453 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-cni-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714474 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-etc-kubernetes\") pod \"multus-nz6qt\" (UID: 
\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714494 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b5zp\" (UniqueName: \"kubernetes.io/projected/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-kube-api-access-7b5zp\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714516 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-system-cni-dir\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714575 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714598 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovn-node-metrics-cert\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714629 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-k8s-cni-cncf-io\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714649 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-cni-bin\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714672 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-conf-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714692 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-netns\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714713 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-os-release\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714734 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-etc-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714759 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl8pq\" (UniqueName: \"kubernetes.io/projected/70f872ad-e694-4743-8269-72456cb8d037-kube-api-access-wl8pq\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714780 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-var-lib-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714803 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714828 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-env-overrides\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714863 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-os-release\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714883 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70f872ad-e694-4743-8269-72456cb8d037-cni-binary-copy\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714925 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-cni-multus\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714950 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714970 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-slash\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.714988 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-ovn\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715010 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-socket-dir-parent\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715029 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-kubelet\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715049 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-hostroot\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715068 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-daemon-config\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " 
pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715088 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/70f872ad-e694-4743-8269-72456cb8d037-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715111 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-cni-binary-copy\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715130 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-system-cni-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715151 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-multus-certs\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715172 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-log-socket\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715194 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-script-lib\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715214 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96n87\" (UniqueName: \"kubernetes.io/projected/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-kube-api-access-96n87\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715236 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-kubelet\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715254 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-bin\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715275 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-cnibin\") pod \"multus-additional-cni-plugins-8sv24\" (UID: 
\"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715295 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-netns\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715315 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-config\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715334 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-cnibin\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715356 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715377 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-netd\") pod \"ovnkube-node-pllhm\" (UID: 
\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.715396 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.715498 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:23.715480689 +0000 UTC m=+23.019239313 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.715601 4761 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.715639 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:23.715629673 +0000 UTC m=+23.019388307 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.715836 4761 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:21 crc kubenswrapper[4761]: E1201 10:31:21.715873 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:23.7158628 +0000 UTC m=+23.019621434 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.725956 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.736619 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.748928 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.759535 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.771881 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816446 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-system-cni-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816505 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-multus-certs\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816528 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-log-socket\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816569 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-script-lib\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816588 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96n87\" (UniqueName: \"kubernetes.io/projected/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-kube-api-access-96n87\") pod 
\"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816598 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-system-cni-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816661 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-netns\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816663 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-multus-certs\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816609 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-netns\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816714 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-kubelet\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 
10:31:21.816734 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-bin\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816754 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-cnibin\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816773 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-cnibin\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816793 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-config\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816812 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816835 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-netd\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816876 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816901 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-systemd-units\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816919 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-node-log\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816939 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816960 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-systemd\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.816997 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-cni-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817015 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-etc-kubernetes\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817037 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b5zp\" (UniqueName: \"kubernetes.io/projected/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-kube-api-access-7b5zp\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817057 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-system-cni-dir\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817085 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-k8s-cni-cncf-io\") 
pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817106 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-cni-bin\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817126 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovn-node-metrics-cert\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817147 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-conf-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817165 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-netns\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817187 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-os-release\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 
crc kubenswrapper[4761]: I1201 10:31:21.817241 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-etc-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817264 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl8pq\" (UniqueName: \"kubernetes.io/projected/70f872ad-e694-4743-8269-72456cb8d037-kube-api-access-wl8pq\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817288 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70f872ad-e694-4743-8269-72456cb8d037-cni-binary-copy\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817310 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-cni-multus\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817329 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-script-lib\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc 
kubenswrapper[4761]: I1201 10:31:21.817332 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-var-lib-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817361 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-var-lib-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817394 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817424 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-env-overrides\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817444 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-os-release\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc 
kubenswrapper[4761]: I1201 10:31:21.817486 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-slash\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817517 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-ovn\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817539 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/70f872ad-e694-4743-8269-72456cb8d037-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817577 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-cni-binary-copy\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817598 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-socket-dir-parent\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817617 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-kubelet\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817635 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-hostroot\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817650 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-daemon-config\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817673 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-kubelet\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817713 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-bin\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817744 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-cnibin\") pod 
\"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.817792 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-cnibin\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818151 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-daemon-config\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818190 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-log-socket\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818213 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818421 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-config\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818469 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818497 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-env-overrides\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818505 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-netd\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818536 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818596 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-systemd-units\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 
10:31:21.818646 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-node-log\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818754 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-os-release\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818795 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-socket-dir-parent\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818814 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-kubelet\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818839 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-hostroot\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818859 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-ovn\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.818906 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-slash\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819166 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-cni-binary-copy\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819215 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-etc-kubernetes\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819254 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-run-k8s-cni-cncf-io\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819265 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-etc-openvswitch\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819325 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-os-release\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819281 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-system-cni-dir\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819338 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-cni-multus\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819335 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-netns\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819371 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-systemd\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819365 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-conf-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819352 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-host-var-lib-cni-bin\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819595 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-multus-cni-dir\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.819846 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70f872ad-e694-4743-8269-72456cb8d037-cni-binary-copy\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.820054 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/70f872ad-e694-4743-8269-72456cb8d037-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.848762 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovn-node-metrics-cert\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.848894 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96n87\" (UniqueName: \"kubernetes.io/projected/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-kube-api-access-96n87\") pod \"ovnkube-node-pllhm\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.848931 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b5zp\" (UniqueName: \"kubernetes.io/projected/7a9149d7-77b0-4df1-8d1a-5a94ef00463a-kube-api-access-7b5zp\") pod \"multus-nz6qt\" (UID: \"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\") " pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.851594 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl8pq\" (UniqueName: \"kubernetes.io/projected/70f872ad-e694-4743-8269-72456cb8d037-kube-api-access-wl8pq\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.916641 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nz6qt" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.924223 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.976162 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70f872ad-e694-4743-8269-72456cb8d037-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8sv24\" (UID: \"70f872ad-e694-4743-8269-72456cb8d037\") " pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:21 crc kubenswrapper[4761]: W1201 10:31:21.989171 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a9149d7_77b0_4df1_8d1a_5a94ef00463a.slice/crio-f282fd91c493bf942bf5c2edb53c4206b365b21cca1bc9ef04bc26ce5ada847b WatchSource:0}: Error finding container f282fd91c493bf942bf5c2edb53c4206b365b21cca1bc9ef04bc26ce5ada847b: Status 404 returned error can't find the container with id f282fd91c493bf942bf5c2edb53c4206b365b21cca1bc9ef04bc26ce5ada847b Dec 01 10:31:21 crc kubenswrapper[4761]: W1201 10:31:21.991117 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod463dbf7c_b2d9_4f91_819c_f74a30d5d01b.slice/crio-a824dc72377a6db821ea40beed6150d7a255b974a9baeddea434ee4b93b58e9e WatchSource:0}: Error finding container a824dc72377a6db821ea40beed6150d7a255b974a9baeddea434ee4b93b58e9e: Status 404 returned error can't find the container with id a824dc72377a6db821ea40beed6150d7a255b974a9baeddea434ee4b93b58e9e Dec 01 10:31:21 crc kubenswrapper[4761]: I1201 10:31:21.994210 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.006068 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.020207 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a7240
8debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.047286 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.059280 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.067819 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.127988 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.128057 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.127996 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:22 crc kubenswrapper[4761]: E1201 10:31:22.128134 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:22 crc kubenswrapper[4761]: E1201 10:31:22.128225 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:22 crc kubenswrapper[4761]: E1201 10:31:22.128296 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.209199 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8sv24" Dec 01 10:31:22 crc kubenswrapper[4761]: W1201 10:31:22.241936 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f872ad_e694_4743_8269_72456cb8d037.slice/crio-c459cab68a51d2148e7be53cf35a5a2bc7cf7a1d28e09638d77f3d66ab815fac WatchSource:0}: Error finding container c459cab68a51d2148e7be53cf35a5a2bc7cf7a1d28e09638d77f3d66ab815fac: Status 404 returned error can't find the container with id c459cab68a51d2148e7be53cf35a5a2bc7cf7a1d28e09638d77f3d66ab815fac Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.313210 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jbqqz" event={"ID":"8b958982-d6cc-45e7-b3f4-1684bfa145bc","Type":"ContainerStarted","Data":"4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531"} Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.323708 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"a824dc72377a6db821ea40beed6150d7a255b974a9baeddea434ee4b93b58e9e"} Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.326566 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" event={"ID":"70f872ad-e694-4743-8269-72456cb8d037","Type":"ContainerStarted","Data":"c459cab68a51d2148e7be53cf35a5a2bc7cf7a1d28e09638d77f3d66ab815fac"} Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.327560 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nz6qt" event={"ID":"7a9149d7-77b0-4df1-8d1a-5a94ef00463a","Type":"ContainerStarted","Data":"f282fd91c493bf942bf5c2edb53c4206b365b21cca1bc9ef04bc26ce5ada847b"} Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.329666 4761 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4"} Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.329696 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"0ff7015a38d3b9d4b65227778ff0f8fd9862fb446a3cdda0f573c6cc721e634a"} Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.338906 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a7240
8debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: E1201 10:31:22.339192 4761 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.353221 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.370721 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.385262 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.408780 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4
752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.426442 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.435007 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.450335 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.462801 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.475747 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.487984 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.500373 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.512999 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:22 crc kubenswrapper[4761]: I1201 10:31:22.523577 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:22Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.337159 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb" exitCode=0 Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.337268 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb"} Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.339269 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5"} Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.340674 4761 generic.go:334] "Generic (PLEG): container finished" 
podID="70f872ad-e694-4743-8269-72456cb8d037" containerID="12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677" exitCode=0 Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.340723 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" event={"ID":"70f872ad-e694-4743-8269-72456cb8d037","Type":"ContainerDied","Data":"12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677"} Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.342412 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nz6qt" event={"ID":"7a9149d7-77b0-4df1-8d1a-5a94ef00463a","Type":"ContainerStarted","Data":"5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb"} Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.344236 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d"} Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.352475 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.372081 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.385267 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.408691 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.423035 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a7240
8debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.444109 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.455940 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.471706 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.491453 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4
752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.503648 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.512920 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.529404 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.541598 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.552616 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.566577 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.578381 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.591453 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.603463 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.623470 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.634447 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.634494 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.634622 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.634640 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:23 crc 
kubenswrapper[4761]: E1201 10:31:23.634684 4761 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.634731 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:27.63471832 +0000 UTC m=+26.938476944 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.634781 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.634790 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.634797 4761 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.634817 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:27.634811433 +0000 UTC m=+26.938570057 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.653043 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.678379 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.709853 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.730744 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.735354 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.735438 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.735477 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.735568 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:27.73552505 +0000 UTC m=+27.039283674 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.735615 4761 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.735624 4761 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.735671 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:27.735659554 +0000 UTC m=+27.039418178 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:23 crc kubenswrapper[4761]: E1201 10:31:23.735690 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:31:27.735682194 +0000 UTC m=+27.039440938 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.743155 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ipta
bles-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.757040 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.770423 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.783409 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.798323 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.860169 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zx6x8"] Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.860661 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.863114 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.863434 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.864616 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.867335 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.875077 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.887108 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.905964 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.916427 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.926866 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.937337 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvprm\" (UniqueName: \"kubernetes.io/projected/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-kube-api-access-pvprm\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.937411 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-host\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.937447 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-serviceca\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.939315 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.951560 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.963561 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.975062 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.986783 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:23 crc kubenswrapper[4761]: I1201 10:31:23.998401 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a7240
8debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.009570 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.020223 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.031242 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.038487 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-serviceca\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.038572 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvprm\" (UniqueName: \"kubernetes.io/projected/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-kube-api-access-pvprm\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.038626 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-host\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.038682 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-host\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.039465 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-serviceca\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.046394 4761 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.046930 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.047912 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.047944 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.047954 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.048048 4761 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.054695 4761 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.054907 4761 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.055743 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.055764 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.055772 4761 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.055784 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.055748 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvprm\" (UniqueName: \"kubernetes.io/projected/cb7be4c9-95e2-452c-9c8d-6bc18b8ff387-kube-api-access-pvprm\") pod \"node-ca-zx6x8\" (UID: \"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\") " pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.055793 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.070884 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.074166 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.074199 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.074207 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.074222 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.074231 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.088443 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.091152 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.091186 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.091196 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.091211 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.091222 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.101242 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.103942 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.103971 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.103980 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.103993 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.104002 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.115011 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.118028 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.118070 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.118092 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.118109 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.118122 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.127668 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.127690 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.127780 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.127787 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.127875 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.127941 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.130170 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: E1201 10:31:24.130446 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.132077 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.132122 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.132133 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.132151 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.132171 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.178600 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zx6x8" Dec 01 10:31:24 crc kubenswrapper[4761]: W1201 10:31:24.191914 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb7be4c9_95e2_452c_9c8d_6bc18b8ff387.slice/crio-23607622f7510917c6a724cee892e8970e95f35e7baa7ee433b89502fffffbac WatchSource:0}: Error finding container 23607622f7510917c6a724cee892e8970e95f35e7baa7ee433b89502fffffbac: Status 404 returned error can't find the container with id 23607622f7510917c6a724cee892e8970e95f35e7baa7ee433b89502fffffbac Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.234559 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.234596 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.234628 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.234643 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.234652 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.336819 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.336850 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.336859 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.336871 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.336880 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.350871 4761 generic.go:334] "Generic (PLEG): container finished" podID="70f872ad-e694-4743-8269-72456cb8d037" containerID="9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d" exitCode=0 Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.350931 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" event={"ID":"70f872ad-e694-4743-8269-72456cb8d037","Type":"ContainerDied","Data":"9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.360977 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.361021 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.361034 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.361045 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.361054 4761 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.361064 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.364302 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zx6x8" event={"ID":"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387","Type":"ContainerStarted","Data":"23607622f7510917c6a724cee892e8970e95f35e7baa7ee433b89502fffffbac"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.368562 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.381293 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.393355 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.415524 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.427628 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.438441 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a7240
8debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.440195 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.440224 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.440236 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.440252 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.440264 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.449741 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.460589 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.479052 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.491672 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.502925 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.514201 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.527491 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.540434 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.542588 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.542628 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.542639 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.542652 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.542661 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.555274 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.644534 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.644592 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.644600 4761 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.644616 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.644627 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.747045 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.747088 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.747098 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.747116 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.747126 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.849011 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.849050 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.849059 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.849075 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.849086 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.951565 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.951616 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.951628 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.951646 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:24 crc kubenswrapper[4761]: I1201 10:31:24.951657 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:24Z","lastTransitionTime":"2025-12-01T10:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.001804 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.005535 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.013134 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"na
me\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.026662 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.041109 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.053240 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.053690 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.053726 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.053736 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.053754 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.053764 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.070575 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.084765 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.095245 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.123831 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.142134 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.153084 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408
debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.155703 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.155802 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.155829 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.155863 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.155886 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.171418 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.204354 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.234040 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.257809 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.257837 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.257846 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.257859 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.257869 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.360446 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.361249 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.361329 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.361402 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.361467 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.397904 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.421390 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.434110 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.446382 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.459873 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.463187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.463210 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.463218 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.463229 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.463237 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.479644 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.492940 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.504413 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.516028 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.530322 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.544813 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.561874 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.565157 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.565195 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.565203 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.565216 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.565224 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.582783 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.644489 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.668180 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.668246 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.668261 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.668277 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.668287 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.672054 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.702884 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.745646 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:25Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.771153 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.771200 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.771212 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.771230 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.771242 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.874353 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.874698 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.874817 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.874902 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.874964 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.977005 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.977037 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.977045 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.977058 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:25 crc kubenswrapper[4761]: I1201 10:31:25.977065 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:25Z","lastTransitionTime":"2025-12-01T10:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.078710 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.078888 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.079000 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.079116 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.079225 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.127818 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.127881 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:26 crc kubenswrapper[4761]: E1201 10:31:26.127930 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.127823 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:26 crc kubenswrapper[4761]: E1201 10:31:26.128009 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:26 crc kubenswrapper[4761]: E1201 10:31:26.128079 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.181591 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.181890 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.182001 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.182089 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.182169 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.284397 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.284450 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.284490 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.284513 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.284528 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.373111 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.374172 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zx6x8" event={"ID":"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387","Type":"ContainerStarted","Data":"d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.377324 4761 generic.go:334] "Generic (PLEG): container finished" podID="70f872ad-e694-4743-8269-72456cb8d037" containerID="c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406" exitCode=0 Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.377398 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" event={"ID":"70f872ad-e694-4743-8269-72456cb8d037","Type":"ContainerDied","Data":"c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.386758 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.386801 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.386813 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.386831 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.386843 
4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.397400 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.411936 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.427253 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.445219 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.464734 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.478393 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.489685 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.489744 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.489758 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.489781 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.489795 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.492270 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e52
3f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.504336 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.516226 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.530228 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.540932 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.552611 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.567672 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.578486 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.588308 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.591353 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.591382 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.591391 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.591404 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.591433 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.598437 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.606713 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.619736 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.632963 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.645196 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.660217 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.680464 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.693656 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.693710 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.693723 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.693740 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 
crc kubenswrapper[4761]: I1201 10:31:26.693754 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.697259 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.710367 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.744302 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.787692 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.796070 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.796103 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.796113 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc 
kubenswrapper[4761]: I1201 10:31:26.796126 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.796134 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.827095 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.869607 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.898925 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.898975 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.898986 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc 
kubenswrapper[4761]: I1201 10:31:26.898999 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.899008 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:26Z","lastTransitionTime":"2025-12-01T10:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.906743 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:26 crc kubenswrapper[4761]: I1201 10:31:26.959481 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.002189 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.002252 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.002264 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.002285 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.002299 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.104792 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.104831 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.104842 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.104859 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.104870 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.208530 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.208601 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.208615 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.208633 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.208644 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.311723 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.311802 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.311821 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.312247 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.312305 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.383858 4761 generic.go:334] "Generic (PLEG): container finished" podID="70f872ad-e694-4743-8269-72456cb8d037" containerID="9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e" exitCode=0 Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.384843 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" event={"ID":"70f872ad-e694-4743-8269-72456cb8d037","Type":"ContainerDied","Data":"9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.406421 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.415837 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.415906 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.415930 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc 
kubenswrapper[4761]: I1201 10:31:27.415960 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.415978 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.425703 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.452710 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.467453 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.479817 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.497331 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.514707 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.519109 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.519175 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.519477 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 
10:31:27.519501 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.519513 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.539614 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.555898 4761 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.580194 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.595801 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.615409 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.627014 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.635169 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.635199 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.635208 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.635222 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.635232 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.637159 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.654515 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T10:31:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.679119 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.679177 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.679323 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.679360 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.679372 4761 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.679424 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:35.679409333 +0000 UTC m=+34.983167957 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.679334 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.679790 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.679803 4761 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.679831 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:35.679823805 +0000 UTC m=+34.983582419 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.737878 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.737912 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.737920 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.737934 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.737943 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.779989 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.780106 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.780156 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.780268 4761 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.780323 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:35.780307125 +0000 UTC m=+35.084065749 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.780613 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:35.780587854 +0000 UTC m=+35.084346478 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.780616 4761 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:27 crc kubenswrapper[4761]: E1201 10:31:27.780681 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:35.780672086 +0000 UTC m=+35.084430810 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.840915 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.840966 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.840981 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.841000 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.841015 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.943326 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.943359 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.943367 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.943381 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:27 crc kubenswrapper[4761]: I1201 10:31:27.943390 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:27Z","lastTransitionTime":"2025-12-01T10:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.046285 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.046327 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.046341 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.046362 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.046375 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.128257 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.128322 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:28 crc kubenswrapper[4761]: E1201 10:31:28.128403 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.128468 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:28 crc kubenswrapper[4761]: E1201 10:31:28.128593 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:28 crc kubenswrapper[4761]: E1201 10:31:28.128676 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.148956 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.149006 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.149023 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.149046 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.149064 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.252253 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.252317 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.252335 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.252360 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.252378 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.356158 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.356200 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.356210 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.356226 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.356236 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.395657 4761 generic.go:334] "Generic (PLEG): container finished" podID="70f872ad-e694-4743-8269-72456cb8d037" containerID="1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4" exitCode=0 Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.395769 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" event={"ID":"70f872ad-e694-4743-8269-72456cb8d037","Type":"ContainerDied","Data":"1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.417358 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.431676 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.447672 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.458746 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.458776 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.458785 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.458797 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.458808 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.467206 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.493581 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.513174 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.527417 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.539965 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.552128 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.561907 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.561946 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.561959 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc 
kubenswrapper[4761]: I1201 10:31:28.561976 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.561987 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.565179 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.582837 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.592426 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.608960 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.619942 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.630444 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:28Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.664008 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.664045 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.664053 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.664068 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.664077 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.766422 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.766464 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.766475 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.766489 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.766500 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.869241 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.869303 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.869320 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.869342 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.869357 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.973154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.973215 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.973239 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.973268 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:28 crc kubenswrapper[4761]: I1201 10:31:28.973290 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:28Z","lastTransitionTime":"2025-12-01T10:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.075730 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.075774 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.075789 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.075809 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.075821 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.177655 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.177697 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.177708 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.177726 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.177739 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.280475 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.280538 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.280592 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.280617 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.280634 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.383656 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.383729 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.383748 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.383774 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.383796 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.404850 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.405365 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.405414 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.409921 4761 generic.go:334] "Generic (PLEG): container finished" podID="70f872ad-e694-4743-8269-72456cb8d037" containerID="4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b" exitCode=0 Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.409995 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" event={"ID":"70f872ad-e694-4743-8269-72456cb8d037","Type":"ContainerDied","Data":"4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.421108 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.440246 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.459879 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.469093 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.469145 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.479656 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.486964 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.487015 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.487034 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.487071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.487088 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.494179 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.517825 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.539067 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.554853 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.574628 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.588161 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.590048 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.590066 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.590074 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.590087 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.590096 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.603463 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.615754 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.634796 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.652524 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.664209 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.681131 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.696479 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.697636 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.697682 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.697693 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.697707 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.697717 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.709955 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.726743 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.737480 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.750947 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.771507 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.787933 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.800780 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.800815 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.800827 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.800840 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.800850 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.804349 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e52
3f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.814347 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.825909 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.834790 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.849491 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.859680 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.869075 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:29Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.904116 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.904196 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.904225 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.904257 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:29 crc kubenswrapper[4761]: I1201 10:31:29.904276 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:29Z","lastTransitionTime":"2025-12-01T10:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.007463 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.007519 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.007536 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.007592 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.007610 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.111859 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.111936 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.111961 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.111994 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.112018 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.128008 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.128064 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:30 crc kubenswrapper[4761]: E1201 10:31:30.128134 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.128064 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:30 crc kubenswrapper[4761]: E1201 10:31:30.128236 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:30 crc kubenswrapper[4761]: E1201 10:31:30.128353 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.215809 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.215871 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.215891 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.215916 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.215934 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.318312 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.318351 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.318362 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.318375 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.318386 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.416317 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" event={"ID":"70f872ad-e694-4743-8269-72456cb8d037","Type":"ContainerStarted","Data":"a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.416413 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.419717 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.419757 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.419765 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.419778 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.419787 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.440296 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.450096 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.459423 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.469646 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.478353 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d
52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.511913 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.522188 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.522238 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.522249 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.522265 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.522277 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.526336 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.538462 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.551174 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.563874 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.575177 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.587605 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.608609 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.624053 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.624089 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.624097 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.624110 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.624119 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.629518 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.641636 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.729088 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.729153 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.729164 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.729200 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.729214 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.832394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.832454 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.832471 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.832498 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.832517 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.935666 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.935835 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.935855 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.935875 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:30 crc kubenswrapper[4761]: I1201 10:31:30.935889 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:30Z","lastTransitionTime":"2025-12-01T10:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.038812 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.038856 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.038870 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.038890 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.038903 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.139947 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.141605 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.141656 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.141672 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.141686 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.141698 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.149969 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.162421 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.176855 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.190772 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.201455 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.215172 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.237153 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380
090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.244995 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.245077 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.245092 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.245110 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.245121 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.252712 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.266442 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.278428 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.291005 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.302982 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.311639 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.332134 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.347278 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.347306 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.347317 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.347330 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.347339 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.420661 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/0.log" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.423506 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb" exitCode=1 Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.423580 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.425728 4761 scope.go:117] "RemoveContainer" containerID="e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.441340 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.449176 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.449201 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.449209 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.449221 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.449228 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.458307 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.472723 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.484069 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.494808 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.507643 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.523681 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380
090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.537739 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.548822 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.551779 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.551816 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.551830 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.551845 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.551856 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.557954 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.568435 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.579328 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.596220 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.604183 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.629184 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:31Z\\\",\\\"message\\\":\\\"I1201 10:31:31.028840 6035 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:31:31.029065 6035 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:31.029075 6035 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1201 10:31:31.029101 6035 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10:31:31.029208 6035 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:31:31.029259 6035 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:31:31.029203 6035 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:31.029286 6035 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:31.029296 6035 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:31.029299 6035 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:31.029319 6035 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:31.029329 6035 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:31:31.029355 6035 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:31.029393 6035 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:31.029411 6035 factory.go:656] Stopping watch factory\\\\nI1201 10:31:31.029429 6035 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
10:31:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a6148899
47fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.653294 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.653321 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.653515 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.653530 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.653538 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.755255 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.755289 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.755299 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.755314 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.755323 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.858382 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.858448 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.858459 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.858472 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.858480 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.960372 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.960416 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.960427 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.960446 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:31 crc kubenswrapper[4761]: I1201 10:31:31.960459 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:31Z","lastTransitionTime":"2025-12-01T10:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.062307 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.062335 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.062343 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.062355 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.062363 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.127319 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.127372 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.127319 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:32 crc kubenswrapper[4761]: E1201 10:31:32.127470 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:32 crc kubenswrapper[4761]: E1201 10:31:32.127513 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:32 crc kubenswrapper[4761]: E1201 10:31:32.127627 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.164530 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.164606 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.164617 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.164633 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.164644 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.267254 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.267295 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.267308 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.267324 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.267335 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.369508 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.369595 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.369613 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.369632 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.369651 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.433107 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/0.log" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.441020 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.441190 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.458171 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2
a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.472178 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.472223 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.472240 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.472263 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.472281 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.475285 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z 
is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.498308 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.513235 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.524836 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.541671 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.554211 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.565122 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.574460 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.574492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.574502 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.574516 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.574524 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.582421 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:31Z\\\",\\\"message\\\":\\\"I1201 10:31:31.028840 6035 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:31:31.029065 6035 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:31.029075 6035 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:31.029101 6035 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1201 10:31:31.029208 6035 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:31:31.029259 6035 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:31:31.029203 6035 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:31.029286 6035 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:31.029296 6035 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:31.029299 6035 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:31.029319 6035 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:31.029329 6035 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:31:31.029355 6035 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:31.029393 6035 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:31.029411 6035 factory.go:656] Stopping watch factory\\\\nI1201 10:31:31.029429 6035 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
10:31:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.593670 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.602579 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.612797 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.622722 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.632240 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.644344 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.678490 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.678602 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.678630 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.678659 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.678683 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.781969 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.782019 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.782035 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.782057 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.782072 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.885001 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.885051 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.885063 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.885079 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.885091 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.992028 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.992083 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.992101 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.992118 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:32 crc kubenswrapper[4761]: I1201 10:31:32.992131 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:32Z","lastTransitionTime":"2025-12-01T10:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.095496 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.095608 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.095629 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.095653 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.095670 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.199703 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.199793 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.199818 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.199856 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.199881 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.302578 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.302641 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.302657 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.302676 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.302689 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.404605 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.404649 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.404662 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.404678 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.404690 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.450865 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/1.log" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.451379 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/0.log" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.454210 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8" exitCode=1 Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.454247 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.454279 4761 scope.go:117] "RemoveContainer" containerID="e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.455306 4761 scope.go:117] "RemoveContainer" containerID="f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8" Dec 01 10:31:33 crc kubenswrapper[4761]: E1201 10:31:33.455461 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.473135 4761 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.491205 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.507996 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.508069 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.508085 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.508106 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.508140 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.510905 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.525623 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.542896 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380
090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.554373 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.564757 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.576208 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.586425 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.600024 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.610611 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc 
kubenswrapper[4761]: I1201 10:31:33.610841 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.610922 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.611020 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.611111 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.614065 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.625812 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.642866 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:31Z\\\",\\\"message\\\":\\\"I1201 10:31:31.028840 6035 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:31:31.029065 6035 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:31.029075 6035 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:31.029101 6035 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1201 10:31:31.029208 6035 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:31:31.029259 6035 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:31:31.029203 6035 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:31.029286 6035 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:31.029296 6035 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:31.029299 6035 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:31.029319 6035 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:31.029329 6035 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:31:31.029355 6035 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:31.029393 6035 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:31.029411 6035 factory.go:656] Stopping watch factory\\\\nI1201 10:31:31.029429 6035 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed 
*v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.664989 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.676008 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.715329 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.715392 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.715419 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.715442 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.715458 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.818868 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.818963 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.818976 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.818996 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.819027 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.930589 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.930642 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.930658 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.930680 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:33 crc kubenswrapper[4761]: I1201 10:31:33.930700 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:33Z","lastTransitionTime":"2025-12-01T10:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.002936 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl"] Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.003776 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.013803 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.014863 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.028857 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.032982 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.033099 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.033187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.033274 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.033343 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.042100 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.054215 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.073571 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.087837 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.127853 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.127902 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c
5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.127957 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.127970 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.128199 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.128578 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.128681 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.136018 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.136231 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.136296 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.136361 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.136431 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.154634 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/70b7d821-1028-4cfc-8a6b-efd9142b60c3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.154912 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/70b7d821-1028-4cfc-8a6b-efd9142b60c3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.155023 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h9x6\" (UniqueName: \"kubernetes.io/projected/70b7d821-1028-4cfc-8a6b-efd9142b60c3-kube-api-access-5h9x6\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.155129 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/70b7d821-1028-4cfc-8a6b-efd9142b60c3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.157436 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.177499 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.189387 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.211967 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.224911 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.236110 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.238684 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.238719 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.238728 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.238741 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.238750 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.248913 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.256232 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/70b7d821-1028-4cfc-8a6b-efd9142b60c3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.256463 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/70b7d821-1028-4cfc-8a6b-efd9142b60c3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.256602 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h9x6\" (UniqueName: \"kubernetes.io/projected/70b7d821-1028-4cfc-8a6b-efd9142b60c3-kube-api-access-5h9x6\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.256706 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/70b7d821-1028-4cfc-8a6b-efd9142b60c3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.256873 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/70b7d821-1028-4cfc-8a6b-efd9142b60c3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.257153 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/70b7d821-1028-4cfc-8a6b-efd9142b60c3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.261479 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.269452 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/70b7d821-1028-4cfc-8a6b-efd9142b60c3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.272633 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.276455 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h9x6\" (UniqueName: \"kubernetes.io/projected/70b7d821-1028-4cfc-8a6b-efd9142b60c3-kube-api-access-5h9x6\") pod \"ovnkube-control-plane-749d76644c-jwhnl\" (UID: \"70b7d821-1028-4cfc-8a6b-efd9142b60c3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.290695 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e07df101664952339a0b67dd28bb230da9211ca6df2d53318905839b871da9fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:31Z\\\",\\\"message\\\":\\\"I1201 10:31:31.028840 6035 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1201 10:31:31.029065 6035 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:31.029075 6035 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:31.029101 6035 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10:31:31.029208 6035 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:31:31.029259 6035 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:31:31.029203 6035 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:31.029286 6035 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:31.029296 6035 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:31.029299 6035 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:31.029319 6035 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:31.029329 6035 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:31:31.029355 6035 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:31.029393 6035 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:31.029411 6035 factory.go:656] Stopping watch factory\\\\nI1201 10:31:31.029429 6035 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed 
*v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.316075 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.342116 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.342189 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.342205 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.342228 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.342243 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.444594 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.444632 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.444642 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.444660 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.444671 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.460066 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" event={"ID":"70b7d821-1028-4cfc-8a6b-efd9142b60c3","Type":"ContainerStarted","Data":"6f9f71c431f8139966cf361ad1548a8a5da3764f71febd08ef248af86698ec13"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.460769 4761 scope.go:117] "RemoveContainer" containerID="f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.460973 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.487170 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 
for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.505920 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.517914 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.521909 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.521971 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.521981 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.522005 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.522017 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.535747 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.535879 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.541306 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.541334 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.541343 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.541357 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.541368 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.546879 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.554089 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.556986 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.557015 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.557024 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.557040 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.557052 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.558487 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.573497 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\
\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":
485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"sys
temUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.577289 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268
fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089
f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.579066 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.579104 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.579117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.579133 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.579145 4761 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.594250 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.594687 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.598788 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.598821 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.598830 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.598846 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.598855 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.608731 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.615209 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: E1201 10:31:34.615324 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.616987 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.617030 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.617042 
4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.617062 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.617074 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.622833 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.635807 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.648933 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.660777 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.678384 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.690907 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.702077 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.718617 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.718656 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.718667 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.718683 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.718695 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.820483 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.820522 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.820531 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.820590 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.820609 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.923594 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.923632 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.923644 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.923661 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:34 crc kubenswrapper[4761]: I1201 10:31:34.923671 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:34Z","lastTransitionTime":"2025-12-01T10:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.027048 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.027085 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.027096 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.027113 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.027124 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.130110 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.130161 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.130175 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.130193 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.130204 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.233093 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.233161 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.233184 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.233214 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.233232 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.336637 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.336695 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.336713 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.336739 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.336756 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.439376 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.439416 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.439426 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.439440 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.439450 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.465504 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" event={"ID":"70b7d821-1028-4cfc-8a6b-efd9142b60c3","Type":"ContainerStarted","Data":"19d97d2da98b4ab8122848158fc4e4c6c7820dd9c628760c179f732b1e4d789f"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.465590 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" event={"ID":"70b7d821-1028-4cfc-8a6b-efd9142b60c3","Type":"ContainerStarted","Data":"9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.467637 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/1.log" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.479888 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.491661 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.494402 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-86rp7"] Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.495513 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.495636 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.513525 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 
handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.525807 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.536191 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.543710 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.543787 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.543811 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.543837 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.543854 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.551894 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.566968 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.569435 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrkh\" (UniqueName: \"kubernetes.io/projected/65d0c868-c268-4723-9323-6937c06b4ea9-kube-api-access-mvrkh\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.569590 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.581680 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.595165 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.608641 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.630725 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.641834 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.646594 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.646625 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.646637 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc 
kubenswrapper[4761]: I1201 10:31:35.646654 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.646665 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.658301 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.670837 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.670874 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrkh\" (UniqueName: \"kubernetes.io/projected/65d0c868-c268-4723-9323-6937c06b4ea9-kube-api-access-mvrkh\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.671163 4761 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.671341 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs podName:65d0c868-c268-4723-9323-6937c06b4ea9 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:31:36.171299374 +0000 UTC m=+35.475058048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs") pod "network-metrics-daemon-86rp7" (UID: "65d0c868-c268-4723-9323-6937c06b4ea9") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.685257 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf227
4df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.699672 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrkh\" (UniqueName: \"kubernetes.io/projected/65d0c868-c268-4723-9323-6937c06b4ea9-kube-api-access-mvrkh\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.707855 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.722407 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.739980 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.749331 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.749461 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.749489 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.749518 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.749542 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.755715 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e52
3f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.768721 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.771656 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.771786 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.771807 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.771818 4761 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.771865 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:51.771851706 +0000 UTC m=+51.075610330 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.772123 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.772220 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.772232 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.772239 4761 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.772268 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:31:51.772261398 +0000 UTC m=+51.076020022 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.780215 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.789909 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\
\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.805966 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4
752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.816759 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.825866 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.844574 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 
handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.852073 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.852155 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.852181 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.852212 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.852237 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.860399 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.871381 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.872629 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 
10:31:35.872728 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.872807 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.872948 4761 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.872992 4761 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.872948 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:51.872905343 +0000 UTC m=+51.176663977 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.873072 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:51.873053637 +0000 UTC m=+51.176812451 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: E1201 10:31:35.873089 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:51.873080968 +0000 UTC m=+51.176839842 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.881848 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc 
kubenswrapper[4761]: I1201 10:31:35.894707 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.906774 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.920295 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.935770 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.951307 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.955283 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.955342 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.955359 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.955386 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:35 crc kubenswrapper[4761]: I1201 10:31:35.955398 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:35Z","lastTransitionTime":"2025-12-01T10:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.057678 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.057719 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.057732 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.057750 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.057762 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.128114 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.128189 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:36 crc kubenswrapper[4761]: E1201 10:31:36.128286 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:36 crc kubenswrapper[4761]: E1201 10:31:36.128373 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.128760 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:36 crc kubenswrapper[4761]: E1201 10:31:36.129086 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.160691 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.160957 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.161130 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.161275 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.161394 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.175338 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:36 crc kubenswrapper[4761]: E1201 10:31:36.175524 4761 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:36 crc kubenswrapper[4761]: E1201 10:31:36.175655 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs podName:65d0c868-c268-4723-9323-6937c06b4ea9 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:37.175631491 +0000 UTC m=+36.479390145 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs") pod "network-metrics-daemon-86rp7" (UID: "65d0c868-c268-4723-9323-6937c06b4ea9") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.265121 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.265177 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.265194 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.265219 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.265238 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.368017 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.368132 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.368157 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.368185 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.368212 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.472175 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.472221 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.472232 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.472254 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.472268 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.575430 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.575479 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.575490 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.575509 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.575521 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.677419 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.677516 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.677529 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.677600 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.677614 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.780497 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.780623 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.780653 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.780680 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.780697 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.883134 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.883197 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.883218 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.883242 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.883259 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.987487 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.987893 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.987910 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.987933 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:36 crc kubenswrapper[4761]: I1201 10:31:36.987949 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:36Z","lastTransitionTime":"2025-12-01T10:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.090799 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.090852 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.090865 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.090887 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.090900 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.127999 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:37 crc kubenswrapper[4761]: E1201 10:31:37.128205 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.188292 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:37 crc kubenswrapper[4761]: E1201 10:31:37.188460 4761 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:37 crc kubenswrapper[4761]: E1201 10:31:37.188525 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs podName:65d0c868-c268-4723-9323-6937c06b4ea9 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:39.188504658 +0000 UTC m=+38.492263282 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs") pod "network-metrics-daemon-86rp7" (UID: "65d0c868-c268-4723-9323-6937c06b4ea9") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.193627 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.193665 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.193673 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.193688 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.193697 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.296018 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.296081 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.296091 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.296104 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.296113 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.398832 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.398866 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.398876 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.398890 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.398899 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.501688 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.501777 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.501800 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.501832 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.501857 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.605535 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.605627 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.605647 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.605677 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.605701 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.681301 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.734051 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.734215 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.734242 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.734286 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.734303 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.734970 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.751204 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.766743 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.781569 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.798977 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.812725 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.826506 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.836709 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc 
kubenswrapper[4761]: I1201 10:31:37.836783 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.836811 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.836841 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.836863 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.849753 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.865457 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.882318 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.900338 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 
for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.912990 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.923215 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.935754 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc 
kubenswrapper[4761]: I1201 10:31:37.940232 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.940285 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.940302 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.940323 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.940338 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:37Z","lastTransitionTime":"2025-12-01T10:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.948650 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.959511 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:37 crc kubenswrapper[4761]: I1201 10:31:37.974097 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.042842 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.042907 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.042928 4761 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.042954 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.042973 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.127959 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.127969 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:38 crc kubenswrapper[4761]: E1201 10:31:38.128185 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:38 crc kubenswrapper[4761]: E1201 10:31:38.128333 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.127984 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:38 crc kubenswrapper[4761]: E1201 10:31:38.128528 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.145595 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.145634 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.145719 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.145743 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.145757 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.247617 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.247657 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.247669 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.247685 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.247697 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.349792 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.349835 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.349850 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.349866 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.349878 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.452150 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.452200 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.452214 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.452235 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.452248 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.554516 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.555499 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.555704 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.555843 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.555969 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.658615 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.658643 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.658653 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.658666 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.658676 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.761749 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.762112 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.762322 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.762464 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.762628 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.865419 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.865676 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.865762 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.865852 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.865935 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.969181 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.969214 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.969224 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.969240 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:38 crc kubenswrapper[4761]: I1201 10:31:38.969250 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:38Z","lastTransitionTime":"2025-12-01T10:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.071945 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.071978 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.071987 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.072001 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.072012 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.127715 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:39 crc kubenswrapper[4761]: E1201 10:31:39.127907 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.175289 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.175619 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.175630 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.175644 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.175653 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.250178 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:39 crc kubenswrapper[4761]: E1201 10:31:39.250354 4761 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:39 crc kubenswrapper[4761]: E1201 10:31:39.250436 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs podName:65d0c868-c268-4723-9323-6937c06b4ea9 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:43.25041318 +0000 UTC m=+42.554171834 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs") pod "network-metrics-daemon-86rp7" (UID: "65d0c868-c268-4723-9323-6937c06b4ea9") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.277796 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.277822 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.277830 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.277842 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.277852 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.381334 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.381747 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.382112 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.382413 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.382700 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.486161 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.486211 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.486224 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.486241 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.486256 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.588482 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.588787 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.588814 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.588846 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.588867 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.692627 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.692681 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.692699 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.692727 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.692765 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.794976 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.795251 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.795352 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.795432 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.795504 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.897521 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.897614 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.897632 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.897651 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.897662 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.999787 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.999820 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4761]: I1201 10:31:39.999830 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:39.999845 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:39.999855 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:39Z","lastTransitionTime":"2025-12-01T10:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.101742 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.101776 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.101787 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.101882 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.101899 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.128151 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.128150 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:40 crc kubenswrapper[4761]: E1201 10:31:40.128751 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.128273 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:40 crc kubenswrapper[4761]: E1201 10:31:40.128783 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:40 crc kubenswrapper[4761]: E1201 10:31:40.129065 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.205008 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.205081 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.205104 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.205146 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.205169 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.308799 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.308879 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.308902 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.308933 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.308954 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.412162 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.412239 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.412256 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.412282 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.412300 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.515703 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.515752 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.515768 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.515788 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.515802 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.619216 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.619272 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.619289 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.619313 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.619331 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.722543 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.722665 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.722694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.722725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.722748 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.826368 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.826446 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.826466 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.826499 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.826521 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.929852 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.929957 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.929971 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.929991 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:40 crc kubenswrapper[4761]: I1201 10:31:40.930002 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:40Z","lastTransitionTime":"2025-12-01T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.032096 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.032158 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.032176 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.032200 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.032217 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.127963 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:41 crc kubenswrapper[4761]: E1201 10:31:41.128326 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.137412 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.137878 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.137988 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.138089 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.138202 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.157652 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.176290 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.197687 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.213669 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.230169 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.241071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.241122 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.241134 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc 
kubenswrapper[4761]: I1201 10:31:41.241154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.241165 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.251580 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.266973 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.284613 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.306117 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 
handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.319055 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.328830 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.343099 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.343141 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.343152 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.343166 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.343177 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.348296 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.361192 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc 
kubenswrapper[4761]: I1201 10:31:41.376297 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.390137 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.406019 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.423368 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:41Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.445416 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.445478 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.445492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.445511 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.445525 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.547781 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.547860 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.547903 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.547926 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.547941 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.652147 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.652228 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.652251 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.652283 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.652309 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.755198 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.755262 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.755283 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.755310 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.755328 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.858716 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.859083 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.859243 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.859752 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.859911 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.963001 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.963350 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.963486 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.963632 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:41 crc kubenswrapper[4761]: I1201 10:31:41.963752 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:41Z","lastTransitionTime":"2025-12-01T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.065590 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.065825 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.065890 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.065956 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.066019 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.128404 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:42 crc kubenswrapper[4761]: E1201 10:31:42.128733 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.128521 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:42 crc kubenswrapper[4761]: E1201 10:31:42.128934 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.128455 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:42 crc kubenswrapper[4761]: E1201 10:31:42.129097 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.168694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.168748 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.168764 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.168782 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.168793 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.271534 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.271609 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.271626 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.271646 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.271659 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.375190 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.375254 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.375270 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.375286 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.375296 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.477587 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.477648 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.477660 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.477679 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.477693 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.579860 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.580346 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.580500 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.580622 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.580745 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.683022 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.683054 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.683062 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.683077 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.683085 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.786095 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.786165 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.786187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.786218 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.786240 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.889781 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.890276 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.890433 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.890647 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.890831 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.994146 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.994217 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.994246 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.994278 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:42 crc kubenswrapper[4761]: I1201 10:31:42.994301 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:42Z","lastTransitionTime":"2025-12-01T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.097248 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.097309 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.097330 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.097355 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.097372 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.128440 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:43 crc kubenswrapper[4761]: E1201 10:31:43.128750 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.200081 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.200131 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.200140 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.200159 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.200170 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.298756 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:43 crc kubenswrapper[4761]: E1201 10:31:43.298954 4761 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:43 crc kubenswrapper[4761]: E1201 10:31:43.299065 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs podName:65d0c868-c268-4723-9323-6937c06b4ea9 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:51.299037405 +0000 UTC m=+50.602796069 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs") pod "network-metrics-daemon-86rp7" (UID: "65d0c868-c268-4723-9323-6937c06b4ea9") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.304448 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.304507 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.304535 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.304596 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.304631 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.408493 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.408544 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.408626 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.408661 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.408677 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.512053 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.512397 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.512631 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.512887 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.513123 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.615874 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.615917 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.615927 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.615944 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.615954 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.719468 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.719524 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.719542 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.719596 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.719615 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.822648 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.822685 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.822695 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.822712 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.822724 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.926184 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.926262 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.926299 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.926330 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:43 crc kubenswrapper[4761]: I1201 10:31:43.926353 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:43Z","lastTransitionTime":"2025-12-01T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.029155 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.029247 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.029270 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.029341 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.029383 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.127901 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.127907 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:44 crc kubenswrapper[4761]: E1201 10:31:44.128186 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:44 crc kubenswrapper[4761]: E1201 10:31:44.128226 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.128678 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:44 crc kubenswrapper[4761]: E1201 10:31:44.128811 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.131881 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.131933 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.131952 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.131968 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.131980 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.233677 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.233959 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.234026 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.234103 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.234165 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.336413 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.336663 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.336744 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.336856 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.336972 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.439510 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.439597 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.439620 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.439649 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.439671 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.542291 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.542348 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.542366 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.542392 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.542411 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.645856 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.645919 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.645941 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.645972 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.645993 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.749054 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.749118 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.749152 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.749178 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.749196 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.851855 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.851904 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.851921 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.851948 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.851964 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.949864 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.949941 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.949971 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.949999 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.950016 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:44 crc kubenswrapper[4761]: E1201 10:31:44.977418 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.982750 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.982848 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.982872 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.982904 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:44 crc kubenswrapper[4761]: I1201 10:31:44.982927 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:44Z","lastTransitionTime":"2025-12-01T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: E1201 10:31:45.004696 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.010720 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.010778 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.010834 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.010893 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.010912 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: E1201 10:31:45.033521 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.038366 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.038484 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.038506 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.038535 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.039035 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: E1201 10:31:45.061316 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.067326 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.067394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.067417 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.067443 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.067463 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: E1201 10:31:45.091393 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4761]: E1201 10:31:45.091649 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.094285 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.094362 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.094378 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.094404 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.094423 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.128869 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:45 crc kubenswrapper[4761]: E1201 10:31:45.129060 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.197151 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.197204 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.197222 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.197251 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.197266 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.299580 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.299622 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.299632 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.299648 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.299660 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.402291 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.402364 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.402378 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.402397 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.402413 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.504892 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.504933 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.504955 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.504985 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.505002 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.608236 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.608292 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.608325 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.608372 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.608398 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.710960 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.711051 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.711110 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.711138 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.711156 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.813909 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.813939 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.813947 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.813979 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.813990 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.917523 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.917617 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.917634 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.917659 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:45 crc kubenswrapper[4761]: I1201 10:31:45.917675 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:45Z","lastTransitionTime":"2025-12-01T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.020633 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.020691 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.020705 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.020727 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.020742 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.122652 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.122726 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.122745 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.122772 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.122793 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.127811 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:46 crc kubenswrapper[4761]: E1201 10:31:46.127914 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.128168 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.128213 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:46 crc kubenswrapper[4761]: E1201 10:31:46.128328 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:46 crc kubenswrapper[4761]: E1201 10:31:46.128227 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.225995 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.226040 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.226053 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.226071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.226084 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.329231 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.329282 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.329298 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.329319 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.329332 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.432282 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.432353 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.432376 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.432406 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.432431 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.535067 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.535108 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.535120 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.535133 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.535142 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.637713 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.637794 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.637833 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.637864 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.637886 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.741138 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.741183 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.741192 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.741209 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.741220 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.843870 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.843905 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.843929 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.843943 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.843952 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.947721 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.947809 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.947978 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.948015 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4761]: I1201 10:31:46.948038 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.050852 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.050929 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.050955 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.050982 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.051001 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.129386 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:47 crc kubenswrapper[4761]: E1201 10:31:47.129652 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.129947 4761 scope.go:117] "RemoveContainer" containerID="f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.154049 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.154096 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.154114 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.154134 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.154149 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.258513 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.258854 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.258873 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.258889 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.258900 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.361044 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.361080 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.361093 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.361109 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.361122 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.464812 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.464884 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.464902 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.464926 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.464943 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.568465 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.568609 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.568627 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.568650 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.568668 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.671300 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.671360 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.671377 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.671404 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.671422 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.774035 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.774067 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.774075 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.774087 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.774096 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.857044 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.876860 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.876949 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.876973 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.877006 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.877030 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.980275 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.980315 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.980326 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.980345 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4761]: I1201 10:31:47.980358 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.082953 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.082990 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.083000 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.083013 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.083023 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.128000 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.128000 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:48 crc kubenswrapper[4761]: E1201 10:31:48.128184 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:48 crc kubenswrapper[4761]: E1201 10:31:48.128226 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.128032 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:48 crc kubenswrapper[4761]: E1201 10:31:48.128330 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.185944 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.186021 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.186047 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.186080 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.186105 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.288345 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.288393 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.288402 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.288416 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.288425 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.390533 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.390627 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.390638 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.390660 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.390672 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.496882 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.496945 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.496960 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.496979 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.497009 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.518030 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/1.log" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.521683 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.522373 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.538033 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.553410 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.570893 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.594137 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.599736 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.599792 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.599809 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.599833 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.599849 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.609027 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.622484 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.649680 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.665079 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.679265 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.692057 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.702332 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.702372 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.702380 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.702394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.702405 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.704794 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.712821 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.731657 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 
handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.745921 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.778795 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.809940 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.810073 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.810086 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.810107 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.810120 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.819075 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.832360 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc 
kubenswrapper[4761]: I1201 10:31:48.913021 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.913093 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.913107 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.913125 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4761]: I1201 10:31:48.913137 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.016381 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.016455 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.016472 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.016945 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.016995 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.119875 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.119938 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.119958 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.119985 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.120001 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.128580 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:49 crc kubenswrapper[4761]: E1201 10:31:49.128790 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.222439 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.222506 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.222516 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.222540 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.222577 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.325760 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.325819 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.325840 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.325862 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.325880 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.429179 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.429242 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.429259 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.429282 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.429301 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.528136 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/2.log" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.529176 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/1.log" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.530896 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.530957 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.530968 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.530989 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.531001 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.533411 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc" exitCode=1 Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.533466 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.533512 4761 scope.go:117] "RemoveContainer" containerID="f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.534623 4761 scope.go:117] "RemoveContainer" containerID="005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc" Dec 01 10:31:49 crc kubenswrapper[4761]: E1201 10:31:49.534910 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.562276 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.581493 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.599781 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.614673 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc 
kubenswrapper[4761]: I1201 10:31:49.634343 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.634374 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.634384 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.634399 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.634410 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.637712 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.658278 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.679395 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.695365 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.718818 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380
090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.734291 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.737491 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.737541 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.737587 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.737610 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.737625 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.748429 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e52
3f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.759763 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.772372 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.784043 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.794314 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.803489 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.819767 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0c8b0029d1af559025c3312a197f4d6bb76dc66e0f08050c8d600d5a10292f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:32Z\\\",\\\"message\\\":\\\"val\\\\nI1201 10:31:32.359507 6193 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:32.359519 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359530 6193 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:32.359541 6193 
handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:32.359563 6193 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:32.359580 6193 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:32.359617 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359623 6193 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:31:32.359698 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.359827 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360000 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360162 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:32.360616 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib
/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.840913 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 
crc kubenswrapper[4761]: I1201 10:31:49.840955 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.840968 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.840987 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.841001 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.943480 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.943527 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.943561 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.943588 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4761]: I1201 10:31:49.943606 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.046635 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.046797 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.046819 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.046844 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.046859 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.128355 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.128404 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:50 crc kubenswrapper[4761]: E1201 10:31:50.128465 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:50 crc kubenswrapper[4761]: E1201 10:31:50.128617 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.128355 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:50 crc kubenswrapper[4761]: E1201 10:31:50.128725 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.149434 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.149468 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.149477 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.149490 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.149499 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.252517 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.252871 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.253003 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.253140 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.253306 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.356361 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.356624 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.356686 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.356751 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.356807 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.459583 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.459654 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.459679 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.459711 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.459733 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.540003 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/2.log" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.544980 4761 scope.go:117] "RemoveContainer" containerID="005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc" Dec 01 10:31:50 crc kubenswrapper[4761]: E1201 10:31:50.545214 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.562196 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.562262 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.562279 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.562303 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.562320 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.568061 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w
l8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-b
incopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.588280 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.603388 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.621467 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.640510 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.659016 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.664502 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.664594 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.664621 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc 
kubenswrapper[4761]: I1201 10:31:50.664653 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.664675 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.674119 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.703913 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\
\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.724976 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.743588 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.761031 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] 
Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.767413 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.767459 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.767471 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.767487 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.767499 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.775973 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.785648 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.795575 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc 
kubenswrapper[4761]: I1201 10:31:50.805910 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.813669 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.821398 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.870062 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.870118 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.870137 4761 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.870161 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.870178 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.973395 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.973471 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.973493 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.973522 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4761]: I1201 10:31:50.973545 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.076381 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.076469 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.076492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.076521 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.076590 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.128434 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.128619 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.143735 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.165502 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.179299 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.179338 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.179348 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.179386 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.179402 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.185230 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver
-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 
10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.199015 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.215800 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.227731 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.240220 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.252951 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.275930 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.281922 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.281967 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.281980 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.281997 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.282007 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.289677 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.299919 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.312065 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.322494 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc 
kubenswrapper[4761]: I1201 10:31:51.332905 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.347139 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.362005 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.376325 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.384376 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.384436 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.384454 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.384477 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.384493 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.390217 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.390364 4761 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.390444 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs podName:65d0c868-c268-4723-9323-6937c06b4ea9 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:07.390420279 +0000 UTC m=+66.694178903 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs") pod "network-metrics-daemon-86rp7" (UID: "65d0c868-c268-4723-9323-6937c06b4ea9") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.486361 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.486418 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.486430 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.486444 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.486454 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.588712 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.588813 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.588839 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.588910 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.588936 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.692006 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.692058 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.692070 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.692088 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.692102 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.793633 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.793713 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.793816 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.793846 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.793862 4761 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.793919 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:23.793899146 +0000 UTC m=+83.097657770 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.793920 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.793962 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.793986 4761 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.794069 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:23.79403779 +0000 UTC m=+83.097796454 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.796106 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.796191 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.796218 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.796245 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.796263 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.894854 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.895016 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:32:23.894983144 +0000 UTC m=+83.198741798 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.895072 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.895163 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.895292 4761 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.895292 4761 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.895354 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:23.895339774 +0000 UTC m=+83.199098428 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: E1201 10:31:51.895391 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:23.895364305 +0000 UTC m=+83.199122969 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.899447 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.899504 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.899521 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.899576 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4761]: I1201 10:31:51.899594 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.002139 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.002193 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.002209 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.002230 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.002242 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.104864 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.105220 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.105371 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.105514 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.105695 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.128222 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.128257 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:52 crc kubenswrapper[4761]: E1201 10:31:52.128345 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.128432 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:52 crc kubenswrapper[4761]: E1201 10:31:52.128600 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:52 crc kubenswrapper[4761]: E1201 10:31:52.128800 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.208701 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.208792 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.208807 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.208833 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.208855 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.310690 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.310984 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.311081 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.311152 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.311218 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.413963 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.414004 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.414016 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.414032 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.414044 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.516270 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.516308 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.516319 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.516335 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.516345 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.619670 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.619749 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.619767 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.619796 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.619813 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.722660 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.722723 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.722738 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.722754 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.722766 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.825994 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.826055 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.826071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.826095 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.826111 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.929639 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.929710 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.929732 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.929758 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4761]: I1201 10:31:52.929778 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.032819 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.032890 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.032907 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.032931 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.032952 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.128451 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:53 crc kubenswrapper[4761]: E1201 10:31:53.128692 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.135141 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.135256 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.135334 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.135366 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.135428 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.237751 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.237824 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.237840 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.237864 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.237880 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.341289 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.341350 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.341373 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.341401 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.341423 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.444031 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.444095 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.444111 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.444132 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.444145 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.546693 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.546749 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.546766 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.546789 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.546812 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.649670 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.649794 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.649809 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.649828 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.649839 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.752624 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.752674 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.752684 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.752701 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.752714 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.856077 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.856356 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.856368 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.856389 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.856402 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.958817 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.958878 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.958890 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.958910 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4761]: I1201 10:31:53.958921 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.061654 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.061691 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.061700 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.061714 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.061723 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.127955 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.127960 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:54 crc kubenswrapper[4761]: E1201 10:31:54.128171 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.128000 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:54 crc kubenswrapper[4761]: E1201 10:31:54.128296 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:54 crc kubenswrapper[4761]: E1201 10:31:54.128390 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.163434 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.163468 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.163476 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.163490 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.163499 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.266990 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.267052 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.267071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.267099 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.267121 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.370006 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.370055 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.370071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.370093 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.370108 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.472323 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.472365 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.472375 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.472390 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.472403 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.574136 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.574177 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.574189 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.574204 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.574215 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.676885 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.676936 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.676951 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.676969 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.676983 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.779854 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.779903 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.779915 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.779937 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.779964 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.883396 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.883479 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.883492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.883517 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.883539 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.986975 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.987037 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.987054 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.987081 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4761]: I1201 10:31:54.987098 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.089686 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.089732 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.089748 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.089769 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.089786 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.128200 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:55 crc kubenswrapper[4761]: E1201 10:31:55.128378 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.192836 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.192876 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.192888 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.192907 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.192919 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.236887 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.236943 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.236964 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.236991 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.237014 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.237800 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.252953 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 10:31:55 crc kubenswrapper[4761]: E1201 10:31:55.258836 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.260626 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.264279 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.264338 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.264357 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.264378 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.264396 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.279856 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39
624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: E1201 10:31:55.284612 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.288368 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.288522 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.288684 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.288939 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.289190 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.301499 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z 
is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: E1201 10:31:55.305061 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.310591 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.310789 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.310875 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.310962 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.311053 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: E1201 10:31:55.327839 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.333105 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.333180 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.333206 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.333236 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.333257 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.337147 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.357664 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: E1201 10:31:55.360681 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: E1201 10:31:55.360916 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.362915 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.362984 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.363003 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.363027 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.363044 4761 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.375159 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.397605 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.411236 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.425621 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.439575 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.454271 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.465994 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.466027 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.466037 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.466049 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.466058 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.468018 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.483369 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.504086 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.520190 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.535850 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.550373 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.568128 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.568178 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.568189 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.568208 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.568220 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.671816 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.671863 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.671878 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.671897 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.671910 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.774934 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.774981 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.774993 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.775013 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.775024 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.878078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.878150 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.878187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.878219 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.878242 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.981305 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.981429 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.981449 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.981478 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4761]: I1201 10:31:55.981499 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.084155 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.084197 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.084208 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.084221 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.084229 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.128432 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.128516 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.128582 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:56 crc kubenswrapper[4761]: E1201 10:31:56.128714 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:56 crc kubenswrapper[4761]: E1201 10:31:56.128916 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:56 crc kubenswrapper[4761]: E1201 10:31:56.129044 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.186841 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.186908 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.186924 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.186949 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.186963 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.289784 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.289838 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.289851 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.289874 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.289886 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.392341 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.392391 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.392403 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.392424 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.392435 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.495159 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.495213 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.495251 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.495273 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.495287 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.598450 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.598613 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.598644 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.598677 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.598705 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.700879 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.700960 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.700977 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.700997 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.701014 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.803246 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.803322 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.803341 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.803370 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.803389 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.905763 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.905830 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.905841 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.905864 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4761]: I1201 10:31:56.905875 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.009295 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.009373 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.009392 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.009422 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.009443 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.112379 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.112434 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.112445 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.112461 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.112472 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.128164 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:57 crc kubenswrapper[4761]: E1201 10:31:57.128399 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.216113 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.216201 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.216232 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.216267 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.216291 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.319835 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.319923 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.319952 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.319988 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.320009 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.422964 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.423013 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.423024 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.423045 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.423091 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.527666 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.527711 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.527724 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.527750 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.527763 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.630665 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.630716 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.630728 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.630746 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.630758 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.733097 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.733139 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.733150 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.733184 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.733196 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.836068 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.836101 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.836112 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.836128 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.836139 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.939605 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.939660 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.939677 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.939698 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4761]: I1201 10:31:57.939715 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.042640 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.042721 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.042749 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.042791 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.042825 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.127667 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.127698 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.127777 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:58 crc kubenswrapper[4761]: E1201 10:31:58.127873 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:58 crc kubenswrapper[4761]: E1201 10:31:58.128014 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:58 crc kubenswrapper[4761]: E1201 10:31:58.128086 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.145521 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.145572 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.145585 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.145602 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.145614 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.248344 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.248420 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.248438 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.248467 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.248485 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.352289 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.352343 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.352357 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.352379 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.352392 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.455810 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.455846 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.455857 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.455872 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.455884 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.558674 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.558717 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.558728 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.558750 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.558760 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.661341 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.661370 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.661377 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.661393 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.661403 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.764304 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.764359 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.764378 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.764401 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.764417 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.867032 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.867078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.867086 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.867100 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.867109 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.970194 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.970238 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.970254 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.970272 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4761]: I1201 10:31:58.970284 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.072703 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.072771 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.072794 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.072816 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.072832 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.127826 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:31:59 crc kubenswrapper[4761]: E1201 10:31:59.127944 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.174785 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.174815 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.174822 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.174836 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.174845 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.277372 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.277440 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.277463 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.277486 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.277506 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.380675 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.380747 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.380764 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.380788 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.380807 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.483600 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.483646 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.483657 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.483672 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.483685 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.585802 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.585851 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.585868 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.585892 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.585910 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.689412 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.689490 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.689515 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.689545 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.689605 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.792800 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.792858 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.792872 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.792893 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.792907 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.895676 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.895731 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.895747 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.895771 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.895788 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.998131 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.998193 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.998209 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.998234 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4761]: I1201 10:31:59.998251 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.101527 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.101656 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.101676 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.101706 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.101726 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.128235 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.128281 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:00 crc kubenswrapper[4761]: E1201 10:32:00.128656 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.128689 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:00 crc kubenswrapper[4761]: E1201 10:32:00.128880 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:00 crc kubenswrapper[4761]: E1201 10:32:00.129053 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.205600 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.205664 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.205688 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.205715 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.205739 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.309597 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.309657 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.309674 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.309699 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.309720 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.413078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.413138 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.413159 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.413184 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.413201 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.516704 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.516750 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.516759 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.516775 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.516786 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.620213 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.620286 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.620310 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.620333 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.620352 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.724032 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.724078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.724090 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.724105 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.724118 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.826947 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.826996 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.827007 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.827025 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.827036 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.929654 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.929714 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.929725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.929743 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4761]: I1201 10:32:00.929757 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.033173 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.033223 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.033238 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.033261 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.033311 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.128572 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:01 crc kubenswrapper[4761]: E1201 10:32:01.129028 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.129293 4761 scope.go:117] "RemoveContainer" containerID="005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc" Dec 01 10:32:01 crc kubenswrapper[4761]: E1201 10:32:01.129476 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.142033 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.142088 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.142117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.142145 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.142166 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.149937 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.164883 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.180538 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.194920 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc 
kubenswrapper[4761]: I1201 10:32:01.218347 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.237605 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.245475 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.245533 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.245562 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.245582 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.245596 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.253967 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.274094 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.292681 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.307671 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.324871 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.348031 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc 
kubenswrapper[4761]: I1201 10:32:01.348341 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.348355 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.348373 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.348384 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.348309 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.366474 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.380630 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.396602 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.412099 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.423648 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.443593 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.451857 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.452182 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.452324 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.452613 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.452752 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.554905 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.554954 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.554963 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.554978 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.554987 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.657004 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.657221 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.657311 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.657377 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.657452 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.760025 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.760092 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.760123 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.760138 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.760148 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.863594 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.863642 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.863655 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.863672 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.863686 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.967075 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.967117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.967128 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.967143 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4761]: I1201 10:32:01.967155 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.070530 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.070653 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.070722 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.070754 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.070811 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.127497 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:02 crc kubenswrapper[4761]: E1201 10:32:02.127768 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.127536 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.127511 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:02 crc kubenswrapper[4761]: E1201 10:32:02.127891 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:02 crc kubenswrapper[4761]: E1201 10:32:02.128033 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.173886 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.173936 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.173953 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.173976 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.173993 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.277436 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.277473 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.277481 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.277495 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.277505 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.380021 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.380078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.380116 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.380140 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.380157 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.483331 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.483492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.483515 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.483540 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.483600 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.585893 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.585937 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.585948 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.585966 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.585978 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.687935 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.687968 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.687976 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.687991 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.687999 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.791108 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.791163 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.791175 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.791195 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.791208 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.893632 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.893665 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.893678 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.893696 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.893707 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.996390 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.996645 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.996736 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.996820 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4761]: I1201 10:32:02.996905 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.099769 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.100115 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.100307 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.100510 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.100749 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.127821 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:03 crc kubenswrapper[4761]: E1201 10:32:03.128169 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.204394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.204454 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.204472 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.204496 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.204512 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.308388 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.308874 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.309404 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.309769 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.310037 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.413225 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.413310 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.413328 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.413347 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.413360 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.516668 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.516737 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.516755 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.516782 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.516799 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.620460 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.620496 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.620506 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.620520 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.620530 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.723312 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.723788 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.724013 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.724233 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.724479 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.829154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.829216 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.829228 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.829252 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.829268 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.931648 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.931729 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.931759 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.931788 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4761]: I1201 10:32:03.931805 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.033786 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.033850 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.033863 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.033880 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.033894 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.127996 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.128032 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.128097 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:04 crc kubenswrapper[4761]: E1201 10:32:04.128158 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:04 crc kubenswrapper[4761]: E1201 10:32:04.128236 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:04 crc kubenswrapper[4761]: E1201 10:32:04.128312 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.136442 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.136496 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.136508 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.136525 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.136542 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.239900 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.239944 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.239957 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.239974 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.239986 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.343232 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.343347 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.343372 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.343862 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.344077 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.446087 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.446122 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.446133 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.446146 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.446157 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.548455 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.548519 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.548535 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.548591 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.548608 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.651258 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.651325 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.651342 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.651888 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.651931 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.755531 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.755588 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.755596 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.755610 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.755621 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.858532 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.858750 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.858772 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.859173 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.859398 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.961940 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.961983 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.961999 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.962021 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4761]: I1201 10:32:04.962037 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.064456 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.064484 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.064492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.064504 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.064512 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.128331 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:05 crc kubenswrapper[4761]: E1201 10:32:05.128472 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.166893 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.166933 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.166946 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.166962 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.166972 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.269187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.269251 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.269269 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.269294 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.269313 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.372713 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.372793 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.372810 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.372835 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.372852 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.419263 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.419314 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.419325 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.419340 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.419348 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: E1201 10:32:05.432711 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.438058 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.438108 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.438120 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.438138 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.438150 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: E1201 10:32:05.451792 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.455725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.455813 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.455823 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.455840 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.455850 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: E1201 10:32:05.468214 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.472755 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.472814 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.472825 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.472845 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.472858 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: E1201 10:32:05.490528 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.495656 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.495696 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.495707 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.495726 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.495739 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: E1201 10:32:05.509534 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:05 crc kubenswrapper[4761]: E1201 10:32:05.509794 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.511940 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.512014 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.512027 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.512045 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.512057 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.614006 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.614071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.614083 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.614101 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.614114 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.721463 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.722186 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.722224 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.722253 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.722277 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.825533 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.825605 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.825619 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.825637 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.825650 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.928509 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.928591 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.928604 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.928620 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4761]: I1201 10:32:05.928631 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.030967 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.031010 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.031020 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.031036 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.031046 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.128345 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.128391 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:06 crc kubenswrapper[4761]: E1201 10:32:06.128515 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.128569 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:06 crc kubenswrapper[4761]: E1201 10:32:06.128685 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:06 crc kubenswrapper[4761]: E1201 10:32:06.128748 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.133303 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.133337 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.133353 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.133371 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.133387 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.235713 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.235777 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.235800 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.235825 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.235846 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.338039 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.338093 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.338105 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.338123 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.338135 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.440914 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.440956 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.440965 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.440979 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.440987 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.543476 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.543704 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.543730 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.543754 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.543848 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.646605 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.646652 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.646670 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.646690 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.646705 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.749462 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.749519 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.749544 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.749576 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.749586 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.851926 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.852187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.852284 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.852386 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.852447 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.955509 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.955579 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.955596 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.955619 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4761]: I1201 10:32:06.955636 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.058382 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.058429 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.058444 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.058466 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.058484 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.128415 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:07 crc kubenswrapper[4761]: E1201 10:32:07.128826 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.161205 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.161238 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.161249 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.161265 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.161277 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.263057 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.263144 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.263157 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.263175 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.263188 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.365960 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.366260 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.366385 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.366496 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.366613 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.467304 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:07 crc kubenswrapper[4761]: E1201 10:32:07.467506 4761 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:07 crc kubenswrapper[4761]: E1201 10:32:07.467594 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs podName:65d0c868-c268-4723-9323-6937c06b4ea9 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:39.467572829 +0000 UTC m=+98.771331493 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs") pod "network-metrics-daemon-86rp7" (UID: "65d0c868-c268-4723-9323-6937c06b4ea9") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.468814 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.468845 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.468864 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.468882 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.468893 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.571458 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.571598 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.571613 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.571639 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.571653 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.674530 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.675162 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.675229 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.675312 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.675423 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.777848 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.777885 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.777894 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.777910 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.777920 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.880874 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.880924 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.880944 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.880963 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.880974 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.984130 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.984189 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.984201 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.984219 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4761]: I1201 10:32:07.984241 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.087138 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.087195 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.087210 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.087229 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.087241 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.127950 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.127974 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:08 crc kubenswrapper[4761]: E1201 10:32:08.128126 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:08 crc kubenswrapper[4761]: E1201 10:32:08.128237 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.127974 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:08 crc kubenswrapper[4761]: E1201 10:32:08.128459 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.189774 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.189826 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.189837 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.189854 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.189866 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.292668 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.292717 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.292731 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.292750 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.292766 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.396076 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.396121 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.396134 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.396151 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.396163 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.498689 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.498735 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.498746 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.498764 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.498780 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.601585 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.601650 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.601670 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.601696 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.601713 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.716640 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.716877 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.716890 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.716908 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.716923 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.819711 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.819776 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.819793 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.819818 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.819834 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.922629 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.922689 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.922699 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.922722 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4761]: I1201 10:32:08.922738 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.025050 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.025096 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.025107 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.025123 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.025137 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.127502 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:09 crc kubenswrapper[4761]: E1201 10:32:09.127732 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.128022 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.128069 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.128110 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.128132 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.128145 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.230990 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.231043 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.231053 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.231071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.231085 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.333413 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.333462 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.333477 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.333492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.333502 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.436166 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.436218 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.436230 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.436247 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.436261 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.538402 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.538447 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.538459 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.538479 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.538490 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.604129 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/0.log" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.604182 4761 generic.go:334] "Generic (PLEG): container finished" podID="7a9149d7-77b0-4df1-8d1a-5a94ef00463a" containerID="5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb" exitCode=1 Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.604215 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nz6qt" event={"ID":"7a9149d7-77b0-4df1-8d1a-5a94ef00463a","Type":"ContainerDied","Data":"5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.604683 4761 scope.go:117] "RemoveContainer" containerID="5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.629740 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.640243 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.640287 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.640300 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.640318 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.640330 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.648079 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.663960 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.679512 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.700861 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.713679 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.727577 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:08Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:23+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb\\\\n2025-12-01T10:31:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb to /host/opt/cni/bin/\\\\n2025-12-01T10:31:23Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:23Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:32:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.778619 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.778673 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.778690 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.778714 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.778727 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.782160 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.793479 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.810932 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.825440 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.837782 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.852164 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.864699 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc 
kubenswrapper[4761]: I1201 10:32:09.878756 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.882573 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.882619 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.882645 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.882703 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.882717 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.898345 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.913893 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.932584 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.985385 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.985446 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.985459 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.985474 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4761]: I1201 10:32:09.985485 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.087655 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.087693 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.087730 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.087748 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.087759 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.128125 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:10 crc kubenswrapper[4761]: E1201 10:32:10.128289 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.128614 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:10 crc kubenswrapper[4761]: E1201 10:32:10.128683 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.128763 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:10 crc kubenswrapper[4761]: E1201 10:32:10.128889 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.189881 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.189920 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.189930 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.189946 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.189957 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.291985 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.292040 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.292053 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.292071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.292084 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.394683 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.394737 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.394751 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.394771 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.394785 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.497062 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.497114 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.497127 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.497148 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.497161 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.599616 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.599664 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.599677 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.599694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.599707 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.610052 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/0.log" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.610124 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nz6qt" event={"ID":"7a9149d7-77b0-4df1-8d1a-5a94ef00463a","Type":"ContainerStarted","Data":"4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.624935 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.637388 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.660396 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.674280 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.687625 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.699198 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.702089 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.702129 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.702138 4761 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.702153 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.702161 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.711674 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 
01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.725083 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.738739 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.751250 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.766089 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.789646 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380
090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.804807 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.804847 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.804858 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.804874 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.804888 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.810861 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b
a6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.826612 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.841628 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.854310 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.865863 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.881128 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:08Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb\\\\n2025-12-01T10:31:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb to /host/opt/cni/bin/\\\\n2025-12-01T10:31:23Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T10:32:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:10Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.907460 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.907724 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.907767 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.907794 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4761]: I1201 10:32:10.907813 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.010062 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.010138 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.010161 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.010193 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.010215 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.113002 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.113055 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.113069 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.113088 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.113099 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.127820 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:11 crc kubenswrapper[4761]: E1201 10:32:11.127973 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.143196 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.157350 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.171008 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.183334 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc 
kubenswrapper[4761]: I1201 10:32:11.204004 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.216207 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.216260 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.216274 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.216293 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.216634 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.220005 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.234804 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.250929 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.271458 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380
090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.285568 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.297718 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.309396 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.318942 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.318982 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.318992 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.319008 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.319021 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.320802 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.332394 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.345184 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:08Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb\\\\n2025-12-01T10:31:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb to /host/opt/cni/bin/\\\\n2025-12-01T10:31:23Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T10:32:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.359258 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.370407 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.388662 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.421488 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.421527 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.421539 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.421579 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.421592 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.524019 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.524067 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.524077 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.524094 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.524107 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.626502 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.626620 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.626638 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.626663 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.626683 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.728795 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.728847 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.728859 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.728876 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.728888 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.831476 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.831526 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.831537 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.831568 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.831583 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.934081 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.934363 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.934447 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.934536 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4761]: I1201 10:32:11.934646 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.037067 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.037111 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.037122 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.037141 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.037154 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.128016 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.128056 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.128023 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:12 crc kubenswrapper[4761]: E1201 10:32:12.128142 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:12 crc kubenswrapper[4761]: E1201 10:32:12.128291 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:12 crc kubenswrapper[4761]: E1201 10:32:12.128366 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.139501 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.139532 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.139541 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.139570 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.139579 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.241573 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.241614 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.241624 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.241641 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.241651 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.344198 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.344243 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.344257 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.344274 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.344286 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.447051 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.447092 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.447106 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.447122 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.447134 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.550436 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.550488 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.550499 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.550519 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.550529 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.653242 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.653299 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.653318 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.653404 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.653464 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.756007 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.756064 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.756076 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.756095 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.756108 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.858367 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.858405 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.858415 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.858428 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.858437 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.961065 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.961152 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.961167 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.961184 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4761]: I1201 10:32:12.961196 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.063600 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.063653 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.063665 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.063683 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.063695 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.128425 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:13 crc kubenswrapper[4761]: E1201 10:32:13.128664 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.165832 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.165879 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.165891 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.165909 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.165922 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.268722 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.268764 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.268777 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.268793 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.268804 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.370898 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.370958 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.370975 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.370999 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.371014 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.473498 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.473572 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.473588 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.473606 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.473618 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.576160 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.576218 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.576237 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.576259 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.576275 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.678457 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.678502 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.678512 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.678526 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.678536 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.781150 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.781216 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.781240 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.781271 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.781292 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.883108 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.883150 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.883162 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.883177 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.883187 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.985101 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.985149 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.985162 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.985181 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4761]: I1201 10:32:13.985195 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.087857 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.087900 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.087908 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.087922 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.087933 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.127950 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.128104 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:14 crc kubenswrapper[4761]: E1201 10:32:14.128241 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.128294 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:14 crc kubenswrapper[4761]: E1201 10:32:14.128418 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:14 crc kubenswrapper[4761]: E1201 10:32:14.128975 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.129465 4761 scope.go:117] "RemoveContainer" containerID="005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.190055 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.190098 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.190109 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.190127 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.190140 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.292835 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.292905 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.292925 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.292951 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.292969 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.395562 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.395643 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.395654 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.395669 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.395680 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.498390 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.498427 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.498439 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.498452 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.498461 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.601101 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.601144 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.601154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.601169 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.601177 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.623439 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/2.log" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.626078 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.626586 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.637149 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db7
5923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.658862 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.672533 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.684587 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.696701 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc 
kubenswrapper[4761]: I1201 10:32:14.703065 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.703097 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.703106 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.703118 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.703135 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.708778 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.719734 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.734325 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.749219 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.762180 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.775181 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.789120 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.801159 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.805095 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.805128 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.805139 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.805155 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.805168 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.812144 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.844380 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.862492 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:08Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb\\\\n2025-12-01T10:31:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb to /host/opt/cni/bin/\\\\n2025-12-01T10:31:23Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T10:32:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.888848 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.907442 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.907482 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 
10:32:14.907506 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.907520 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.907529 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4761]: I1201 10:32:14.910026 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-ce
rt-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.009611 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.009682 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.009695 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.009714 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.009727 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.112385 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.112421 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.112434 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.112449 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.112461 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.128250 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:15 crc kubenswrapper[4761]: E1201 10:32:15.128415 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.214915 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.214963 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.214974 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.214992 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.215004 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.317724 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.317821 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.317857 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.317891 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.317913 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.420213 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.420264 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.420276 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.420293 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.420306 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.523891 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.523967 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.523987 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.524027 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.524261 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.626959 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.627011 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.627023 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.627043 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.627054 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.633447 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/3.log" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.634170 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/2.log" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.636695 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30" exitCode=1 Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.636745 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.636821 4761 scope.go:117] "RemoveContainer" containerID="005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.637431 4761 scope.go:117] "RemoveContainer" containerID="f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30" Dec 01 10:32:15 crc kubenswrapper[4761]: E1201 10:32:15.637608 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.652830 4761 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.665425 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.695046 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://005cb3889b395855c7b95cca204be0a70108b94894b6c784c188df34768961dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.728500 6413 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728640 6413 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:31:48.728831 6413 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 10:31:48.729313 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:48.729388 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:48.729402 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:48.729408 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:48.729430 6413 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:48.729476 6413 factory.go:656] Stopping watch factory\\\\nI1201 10:31:48.729496 6413 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:31:48.729542 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:48.729593 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-etcd/etcd-crc\\\\nI1201 10:32:14.962026 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1201 10:32:14.962164 6781 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1201 10:32:14.962197 6781 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port 
openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1201 10:32:14.962213 6781 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expir\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\
\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.706586 4761 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.715867 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.727573 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820
dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.730036 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.730072 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.730087 4761 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.730107 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.730121 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.741600 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 
01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.754843 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.766118 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.780297 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.796270 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.813741 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380
090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.831485 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.833382 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.833460 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.833478 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.833502 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.833518 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.847597 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e52
3f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.859577 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.870193 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.873618 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.873659 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.873670 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.873686 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.873697 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.881350 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39
624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: E1201 10:32:15.884158 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.887291 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.887322 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.887333 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.887349 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.887359 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.893134 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:08Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb\\\\n2025-12-01T10:31:23+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb to /host/opt/cni/bin/\\\\n2025-12-01T10:31:23Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:23Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:32:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: E1201 10:32:15.897878 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.901248 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.901406 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.901508 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.901644 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.901750 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: E1201 10:32:15.914469 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.918198 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.918266 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.918314 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.918335 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.918347 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: E1201 10:32:15.931690 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.935694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.935730 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.935739 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.935755 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.935767 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4761]: E1201 10:32:15.951900 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4761]: E1201 10:32:15.952306 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.953956 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.954001 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.954010 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.954023 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4761]: I1201 10:32:15.954031 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.057265 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.057337 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.057355 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.057378 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.057395 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.128208 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.128232 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.128337 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:16 crc kubenswrapper[4761]: E1201 10:32:16.128605 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:16 crc kubenswrapper[4761]: E1201 10:32:16.128727 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:16 crc kubenswrapper[4761]: E1201 10:32:16.128925 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.160798 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.160867 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.160894 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.160924 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.160946 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.263956 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.264300 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.264474 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.264725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.264884 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.367134 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.367175 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.367186 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.367197 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.367206 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.469942 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.470334 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.470635 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.470789 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.470905 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.573795 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.573842 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.573854 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.573873 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.573885 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.641282 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/3.log" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.646712 4761 scope.go:117] "RemoveContainer" containerID="f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30" Dec 01 10:32:16 crc kubenswrapper[4761]: E1201 10:32:16.646925 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.668664 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:08Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb\\\\n2025-12-01T10:31:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb to /host/opt/cni/bin/\\\\n2025-12-01T10:31:23Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T10:32:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.676964 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.676993 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.677001 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.677014 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.677023 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.702584 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.722711 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.741979 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.761002 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.773752 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.779239 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.779298 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.779317 4761 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.779345 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.779362 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.791820 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39
624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.813439 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.830535 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.859035 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-etcd/etcd-crc\\\\nI1201 10:32:14.962026 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1201 10:32:14.962164 6781 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node 
crc\\\\nI1201 10:32:14.962197 6781 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1201 10:32:14.962213 6781 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expir\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.873838 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.882746 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.883128 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.883422 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.883651 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.883837 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.887072 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.901658 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256
212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.915340 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc 
kubenswrapper[4761]: I1201 10:32:16.929052 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.942271 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.956887 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.973238 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.986323 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.986368 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.986380 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.986397 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4761]: I1201 10:32:16.986409 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.089764 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.089851 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.089867 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.089886 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.089897 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.127849 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:17 crc kubenswrapper[4761]: E1201 10:32:17.127988 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.192031 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.192131 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.192145 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.192163 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.192177 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.294803 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.294869 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.294886 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.294908 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.294925 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.397694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.397746 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.397762 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.397787 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.397803 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.500075 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.500151 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.500174 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.500204 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.500227 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.602721 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.602789 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.602805 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.602823 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.602835 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.705124 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.705158 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.705167 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.705181 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.705190 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.808383 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.808428 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.808442 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.808460 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.808471 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.912141 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.912220 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.912244 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.912273 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4761]: I1201 10:32:17.912296 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.015792 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.015949 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.015969 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.015993 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.016050 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.119191 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.119266 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.119289 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.119326 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.119350 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.127610 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.127731 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.127731 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:18 crc kubenswrapper[4761]: E1201 10:32:18.127898 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:18 crc kubenswrapper[4761]: E1201 10:32:18.128034 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:18 crc kubenswrapper[4761]: E1201 10:32:18.128249 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.222482 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.222588 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.222618 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.222652 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.222675 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.325751 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.325819 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.325851 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.325889 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.325911 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.429522 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.429648 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.429681 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.429713 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.429733 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.532408 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.532456 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.532471 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.532490 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.532505 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.635426 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.635492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.635512 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.635537 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.635595 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.738758 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.738836 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.738859 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.738884 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.738901 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.842019 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.842130 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.842147 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.842171 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.842188 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.945121 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.945162 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.945170 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.945183 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4761]: I1201 10:32:18.945193 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.048332 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.048366 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.048374 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.048388 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.048407 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.127529 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:19 crc kubenswrapper[4761]: E1201 10:32:19.127717 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.151059 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.151175 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.151203 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.151233 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.151260 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.253664 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.253710 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.253718 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.253732 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.253742 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.356532 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.356636 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.356652 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.356673 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.356686 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.459478 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.459529 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.459543 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.459586 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.459596 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.562766 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.562818 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.562835 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.562857 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.562872 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.665692 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.665767 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.665791 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.665820 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.665842 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.768076 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.768389 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.768397 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.768410 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.768424 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.870279 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.870318 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.870327 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.870340 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.870348 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.973523 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.973630 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.973651 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.973708 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4761]: I1201 10:32:19.973731 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.077141 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.077274 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.077319 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.077345 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.077363 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.128060 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.128144 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.128082 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:20 crc kubenswrapper[4761]: E1201 10:32:20.128279 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:20 crc kubenswrapper[4761]: E1201 10:32:20.128407 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:20 crc kubenswrapper[4761]: E1201 10:32:20.128629 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.180318 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.180367 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.180379 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.180396 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.180408 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.282969 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.283036 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.283054 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.283078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.283094 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.385669 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.385737 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.385751 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.385791 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.385804 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.489625 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.489689 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.489708 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.489731 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.489747 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.592705 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.592767 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.592778 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.592814 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.592828 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.696069 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.696142 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.696154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.696182 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.696198 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.798725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.798814 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.798827 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.798868 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.798883 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.901423 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.901481 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.901498 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.901522 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4761]: I1201 10:32:20.901538 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.004710 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.004755 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.004765 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.004782 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.004794 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.108532 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.108584 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.108596 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.108613 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.108626 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.128489 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:21 crc kubenswrapper[4761]: E1201 10:32:21.128696 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.152895 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.171923 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.190352 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.209081 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.211949 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.211992 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.212004 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.212026 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.212038 4761 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.222903 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.240260 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.252731 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057
569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.265634 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:08Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb\\\\n2025-12-01T10:31:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb to /host/opt/cni/bin/\\\\n2025-12-01T10:31:23Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T10:32:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.290217 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.306708 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.314777 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.314823 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.314837 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.314856 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.314871 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.321739 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e52
3f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.344833 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-etcd/etcd-crc\\\\nI1201 10:32:14.962026 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1201 10:32:14.962164 6781 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node 
crc\\\\nI1201 10:32:14.962197 6781 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1201 10:32:14.962213 6781 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expir\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.361817 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.374710 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.387100 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc 
kubenswrapper[4761]: I1201 10:32:21.399806 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.414330 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.417104 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.417129 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.417139 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.417152 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.417161 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.428997 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:21Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:21 crc 
kubenswrapper[4761]: I1201 10:32:21.519797 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.519840 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.519852 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.519868 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.519880 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.622179 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.622235 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.622247 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.622266 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.622277 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.725891 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.725937 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.725950 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.725969 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.725982 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.827886 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.827925 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.827936 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.827952 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.827963 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.930606 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.930691 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.930716 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.930747 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4761]: I1201 10:32:21.930772 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.033349 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.033381 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.033389 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.033403 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.033412 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.127858 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.127935 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.127994 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:22 crc kubenswrapper[4761]: E1201 10:32:22.128119 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:22 crc kubenswrapper[4761]: E1201 10:32:22.128216 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:22 crc kubenswrapper[4761]: E1201 10:32:22.128355 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.136179 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.136253 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.136274 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.136298 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.136317 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.238543 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.238598 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.238609 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.238628 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.238642 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.341219 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.341256 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.341268 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.341306 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.341319 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.443313 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.443400 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.443476 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.443498 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.443510 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.546787 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.546826 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.546837 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.546856 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.546872 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.649132 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.649210 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.649227 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.649251 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.649268 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.755487 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.756257 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.756315 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.756340 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.756353 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.859336 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.859368 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.859400 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.859414 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.859422 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.961408 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.961443 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.961452 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.961464 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4761]: I1201 10:32:22.961474 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.064694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.064757 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.064778 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.064821 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.064844 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.128274 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.128427 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.167303 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.167346 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.167356 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.167372 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.167382 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.270700 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.270771 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.270793 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.270824 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.270845 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.373076 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.373117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.373127 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.373142 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.373153 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.475802 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.475849 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.475866 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.475888 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.475906 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.578456 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.578659 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.578674 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.578691 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.578703 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.682021 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.682091 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.682117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.682150 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.682172 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.784305 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.784357 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.784375 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.784398 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.784414 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.841124 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.841179 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.841317 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.841335 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.841335 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.841387 4761 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.841407 4761 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.841349 4761 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.841480 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.841455594 +0000 UTC m=+147.145214258 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.841531 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.841510716 +0000 UTC m=+147.145269340 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.887335 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.887414 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.887446 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.887477 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.887503 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.942442 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.942660 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.942627565 +0000 UTC m=+147.246386229 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.942722 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.942775 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.942893 4761 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.942897 4761 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.942960 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.942939444 +0000 UTC m=+147.246698148 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:32:23 crc kubenswrapper[4761]: E1201 10:32:23.942977 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.942969495 +0000 UTC m=+147.246728119 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.990903 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.990962 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.990980 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.991004 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4761]: I1201 10:32:23.991022 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.093324 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.093387 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.093399 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.093416 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.093428 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.127603 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.127700 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:24 crc kubenswrapper[4761]: E1201 10:32:24.127751 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.127624 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:24 crc kubenswrapper[4761]: E1201 10:32:24.127826 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:24 crc kubenswrapper[4761]: E1201 10:32:24.127985 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.196791 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.197208 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.197378 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.197519 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.197697 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.300725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.300769 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.300781 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.300799 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.300813 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.403320 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.403393 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.403416 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.403444 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.403463 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.506645 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.506688 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.506701 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.506717 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.506730 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.609621 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.609695 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.609718 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.609749 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.609775 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.713730 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.713778 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.713791 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.713807 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.713818 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.816385 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.816453 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.816466 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.816482 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.816491 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.919134 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.919204 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.919223 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.919245 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4761]: I1201 10:32:24.919262 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.021521 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.021592 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.021608 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.021626 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.021652 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.124499 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.124572 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.124587 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.124602 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.124613 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.128232 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:25 crc kubenswrapper[4761]: E1201 10:32:25.128365 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.227198 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.227247 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.227260 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.227276 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.227289 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.329642 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.329689 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.329701 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.329720 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.329732 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.432464 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.432515 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.432527 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.432574 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.432588 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.535073 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.535133 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.535152 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.535175 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.535197 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.637862 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.637921 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.637933 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.637952 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.637965 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.741154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.741236 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.741254 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.741281 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.741298 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.843027 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.843078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.843092 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.843112 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.843125 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.945749 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.945805 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.945820 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.945837 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4761]: I1201 10:32:25.945851 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.049610 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.049659 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.049672 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.049689 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.049701 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.128113 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.128165 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.128236 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.128298 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.128393 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.128536 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.152848 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.152947 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.152970 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.153004 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.153028 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.256608 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.256683 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.256708 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.256739 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.256763 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.277264 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.278131 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.278403 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.278690 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.278934 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.297320 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.302867 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.302918 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.302935 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.302957 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.302974 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.319225 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.324301 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.324343 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.324354 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.324368 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.324377 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.337987 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.341954 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.342005 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.342020 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.342040 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.342055 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.358153 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.363373 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.363427 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.363440 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.363457 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.363471 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.382521 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:26Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:26 crc kubenswrapper[4761]: E1201 10:32:26.382745 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.384691 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.384739 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.384755 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.384773 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.384789 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.487982 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.488058 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.488071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.488091 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.488101 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.590865 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.590943 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.590962 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.590988 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.591008 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.693424 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.693487 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.693499 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.693515 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.693526 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.796315 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.796383 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.796405 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.796436 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.796462 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.899685 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.899760 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.899778 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.899800 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4761]: I1201 10:32:26.899817 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.003193 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.003233 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.003244 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.003260 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.003272 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.105755 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.105810 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.105821 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.105835 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.105848 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.128223 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:27 crc kubenswrapper[4761]: E1201 10:32:27.129906 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.131078 4761 scope.go:117] "RemoveContainer" containerID="f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30" Dec 01 10:32:27 crc kubenswrapper[4761]: E1201 10:32:27.132094 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.209012 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.209079 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.209091 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.209111 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.209123 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.312916 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.312959 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.312969 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.312985 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.312996 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.415797 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.415878 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.415896 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.415918 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.415935 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.519665 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.519738 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.519754 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.519777 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.519796 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.623063 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.623121 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.623134 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.623154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.623166 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.726491 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.726596 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.726607 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.726628 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.726638 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.829085 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.829156 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.829166 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.829187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.829199 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.931994 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.932041 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.932054 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.932072 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4761]: I1201 10:32:27.932085 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.035829 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.035894 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.035905 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.035927 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.035941 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.128047 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:28 crc kubenswrapper[4761]: E1201 10:32:28.128228 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.128282 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.128283 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:28 crc kubenswrapper[4761]: E1201 10:32:28.128655 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:28 crc kubenswrapper[4761]: E1201 10:32:28.128930 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.139429 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.139498 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.139511 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.139535 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.139585 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.242471 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.242581 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.242607 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.242637 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.242659 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.346024 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.346094 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.346113 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.346138 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.346155 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.448378 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.448459 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.448482 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.448507 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.448528 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.551833 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.551917 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.551950 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.551980 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.552003 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.654545 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.654649 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.654689 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.654721 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.654742 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.758253 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.758342 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.758353 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.758368 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.758382 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.862194 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.862266 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.862286 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.862308 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.862327 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.964976 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.965043 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.965054 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.965076 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4761]: I1201 10:32:28.965093 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.068093 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.068155 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.068170 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.068193 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.068207 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.128034 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:29 crc kubenswrapper[4761]: E1201 10:32:29.128395 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.142932 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.171770 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.171832 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.171843 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.171863 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.171877 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.274426 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.274463 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.274476 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.274491 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.274503 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.384333 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.384394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.384407 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.384428 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.384441 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.488170 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.488211 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.488219 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.488233 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.488241 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.591425 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.591609 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.591633 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.591700 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.591720 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.694726 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.694828 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.694852 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.694931 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.694953 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.797863 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.797923 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.797940 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.797968 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.797984 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.900889 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.900946 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.900962 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.900986 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4761]: I1201 10:32:29.901006 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.004009 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.004094 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.004132 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.004164 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.004185 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.107040 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.107078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.107089 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.107106 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.107119 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.133520 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:30 crc kubenswrapper[4761]: E1201 10:32:30.133704 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.133579 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:30 crc kubenswrapper[4761]: E1201 10:32:30.133830 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.133739 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:30 crc kubenswrapper[4761]: E1201 10:32:30.133916 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.210350 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.210384 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.210394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.210409 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.210420 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.312882 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.312942 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.312959 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.312982 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.313002 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.415813 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.415887 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.415911 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.415937 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.415955 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.519676 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.519756 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.519782 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.519814 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.519837 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.623536 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.623595 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.623606 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.623621 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.623631 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.726646 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.726702 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.726711 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.726728 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.726739 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.829621 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.829681 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.829694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.829747 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.829785 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.932398 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.932475 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.932492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.932514 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4761]: I1201 10:32:30.932530 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.035608 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.035674 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.035690 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.035709 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.035723 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.127816 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:31 crc kubenswrapper[4761]: E1201 10:32:31.127968 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.139036 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.139074 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.139115 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.139132 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.139144 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.144329 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eaf56ffe-a6c0-446a-81db-deae9bd72c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6009b18a39624ad866faf8d0e2952374083acbd92c396c2a269e8966d75d65d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wvvs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qjx5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.163262 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nz6qt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9149d7-77b0-4df1-8d1a-5a94ef00463a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:08Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb\\\\n2025-12-01T10:31:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ec013d1b-b899-4a55-8fda-1ed7d84507cb to /host/opt/cni/bin/\\\\n2025-12-01T10:31:23Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T10:32:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b5zp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nz6qt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.193357 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d92e98-6ec4-4451-adff-e0e3842d7c55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1331908f3e554e83111cabecd0a65d727e7ae7a91bf87b62309477497bf0ef9c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://692f6fa724dbf77247ee24c35356e333e07c619db9a89665d7ef89ecf77e2bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://502292af244de0e4c8f0c0e68579cd1105097c9801597c4df55b3aab9413bb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25fb3274009e1380090fa77caa313b4586dd5552bf818d4c59803078b54bd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b84190f09ed72f117ef1f8144e167f674fbdbcd24e294521a5ba1ef5edd6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275b20f76911ae292aa10c819f4145a8d9fcef45bf2274df8559aec838fb5a3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4422d1d711f3d656860390b5ddb7333da8310424a4c799c69003c2074365359\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4f86d3d3ce8472cd5dfb2c5d8a0a99f4752c8336772f835057e563f59c122c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.211459 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"401f34d6-1db1-49fc-b016-73a397bcd9d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 10:31:19.598519 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 10:31:19.598988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:19.601353 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4008524456/tls.crt::/tmp/serving-cert-4008524456/tls.key\\\\\\\"\\\\nI1201 10:31:19.905457 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:19.907077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:19.907092 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:19.907112 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:19.907117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:19.916068 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:19.916109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916120 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:19.916131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:19.916138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:19.916143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:19.916149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 10:31:19.916077 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 10:31:19.918074 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80008e72f4891d8589781d74860efc9c6
ccd20bfc9ce850df6658bdfaf1011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.228810 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbc2baf5-02f4-4348-82ae-18efcc665fc5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59ec22e69994cd35b7408db30cd2c9b17b7e622e8233f1fb267badfb8237e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b2c5f40fc6e8be8705a95a72408debb7415b95e95cd89fc049e53d8ab4c5bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130353959c9a7c747c102fd369190a7df56e330ebfe59946b96bd3c28bb30cb0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.240912 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f93a6bc-0d7b-48d0-a387-7cd07a41477c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f220a2f2cb387b1c60b9baf8366a1acc33c5686365ff5c76eda254de2656ae8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://510a21630c09190cc5ec401f65e4ceaeebc916831e089aca1eecc75e72001326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9ad2a514d91a1c32e40acb9e436eeb6ea6d5fb4c79415d87688f36b5d4a6fb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://49548a9e5da92e32957cc1657c8eee41903325c12fb187f22ee32971736b1e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.241915 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.241960 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.241971 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.241986 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.241995 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.254446 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.265402 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5074ae-d015-4a93-92af-e25eb90d7868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f5fedd93471d67b094b64485d810027122bd9557ab170623f04cffc87d2c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d5e1f94061c640712c48384a939b06bf428350b9556adf244309e9ab2e899d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07d5e1f94061c640712c48384a939b06bf428350b9556adf244309e9ab2e899d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.282728 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.296429 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jbqqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b958982-d6cc-45e7-b3f4-1684bfa145bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f783df10e26def66a2e123fd9a9623c6db75923e79fc640f2acfcc8c4539531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxz47\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jbqqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.326936 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-etcd/etcd-crc\\\\nI1201 10:32:14.962026 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1201 10:32:14.962164 6781 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node 
crc\\\\nI1201 10:32:14.962197 6781 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1201 10:32:14.962213 6781 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expir\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bbe19b4c05b68a61
810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96n87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pllhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.340948 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae4cc8e6396443abd867df0a9a9d6c6a52978922cb81fe3d93fd9c7242c4f9a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.344072 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.344167 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.344187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.344249 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.344267 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.353469 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zx6x8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb7be4c9-95e2-452c-9c8d-6bc18b8ff387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c33c81cd6c3976f7365d48c56975763668cb1cfc66d52e39f835400148dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvprm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zx6x8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.367679 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b7d821-1028-4cfc-8a6b-efd9142b60c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f63b7256
212231041ab2febc911d470f2705585f49d8d767d6f272c920dde40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19d97d2da98b4ab8122848158fc4e4c6c7820dd9c628760c179f732b1e4d789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5h9x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwhnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.379911 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-86rp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d0c868-c268-4723-9323-6937c06b4ea9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-86rp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc 
kubenswrapper[4761]: I1201 10:32:31.396476 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249b2ff3252a96a15abf0ff224bf95941b1cae4199a474e0fec44c3af612664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.412211 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.427269 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510b4f20e4e6512bb927b7de8d6726b57067e1690c7ceb9e53eb44346341ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52fd4b19c1aa7ebf6475c5f2329a98e427a75316f03a7d4818a196c82055610a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.444885 4761 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sv24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70f872ad-e694-4743-8269-72456cb8d037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5be66251c6e9023fd6275029cd154268fe021f9968efc4943e045e390b119c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12fce126fcaa506f6b389675ebec760cc6e86f9f4b15cb7344dd4c2186b36677\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a537b3a92065e849fb639a301761bda608dfe66892cbba6f8ea67209e04876d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4b5fb2f359c292b6866404b70ab903d0995b4b78db75a804630f50ee8c78406\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99814
43176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9981443176e49c03682686826ba55d8ba71a525ab7a747603264ea759fea3e4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b06e278cd4db4193f66d6b412d21d0558ed749b64db2b3aa3e9b18131c708d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f69bd38e275105e6a45f86cf0b0523e92204c78b2f650244a067de49247223b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl8pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sv24\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.446414 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.446465 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.446481 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.446499 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.446511 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.549209 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.549406 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.549499 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.549631 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.549661 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.652715 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.652788 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.652815 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.652880 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.652904 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.755781 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.755922 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.756111 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.756479 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.756587 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.859774 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.859838 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.859866 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.859913 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.859937 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.963025 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.963078 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.963092 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.963110 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4761]: I1201 10:32:31.963123 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.065595 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.065651 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.065665 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.065690 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.065702 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.127808 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.127947 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:32 crc kubenswrapper[4761]: E1201 10:32:32.128091 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:32 crc kubenswrapper[4761]: E1201 10:32:32.128282 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.128739 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:32 crc kubenswrapper[4761]: E1201 10:32:32.128936 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.169000 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.169069 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.169088 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.169114 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.169131 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.271491 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.271543 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.271585 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.271601 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.271613 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.374082 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.374117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.374125 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.374139 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.374149 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.476891 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.476953 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.476972 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.476996 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.477014 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.580169 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.580222 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.580235 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.580251 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.580261 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.683572 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.683626 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.683637 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.683657 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.683670 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.786118 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.786181 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.786203 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.786234 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.786256 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.890482 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.890783 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.890804 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.891883 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.891952 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.994998 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.995070 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.995086 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.995104 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4761]: I1201 10:32:32.995116 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.098117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.098170 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.098187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.098308 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.098332 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.127506 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:33 crc kubenswrapper[4761]: E1201 10:32:33.127697 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.201619 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.201687 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.201699 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.201715 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.201726 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.304337 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.304384 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.304396 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.304413 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.304425 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.406803 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.406859 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.406873 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.406892 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.406906 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.510033 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.510075 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.510087 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.510103 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.510115 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.612286 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.612383 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.612410 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.612439 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.612460 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.714993 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.715064 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.715076 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.715091 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.715101 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.817621 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.817699 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.817723 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.817751 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.817768 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.919889 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.919930 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.919945 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.919961 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4761]: I1201 10:32:33.919973 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.022183 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.022228 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.022240 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.022257 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.022269 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.125779 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.125902 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.125924 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.125956 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.125977 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.128177 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.128256 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.128177 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:34 crc kubenswrapper[4761]: E1201 10:32:34.128375 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:34 crc kubenswrapper[4761]: E1201 10:32:34.128478 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:34 crc kubenswrapper[4761]: E1201 10:32:34.128625 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.229606 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.229673 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.229699 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.229730 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.229756 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.332677 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.332748 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.332772 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.332804 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.332828 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.436654 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.436721 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.436738 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.436763 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.436782 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.539399 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.539453 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.539466 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.539486 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.539498 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.643364 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.643421 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.643435 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.643454 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.643468 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.745644 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.745690 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.745700 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.745715 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.745726 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.848058 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.848102 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.848115 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.848133 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.848146 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.951146 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.951195 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.951206 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.951224 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4761]: I1201 10:32:34.951235 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.054419 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.054486 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.054495 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.054515 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.054527 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.128948 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:35 crc kubenswrapper[4761]: E1201 10:32:35.129182 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.157959 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.158103 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.158129 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.158166 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.158189 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.260655 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.260706 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.260720 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.260739 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.260751 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.363244 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.363305 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.363317 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.363332 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.363343 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.465638 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.465695 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.465707 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.465727 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.465742 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.568300 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.568340 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.568353 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.568372 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.568383 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.670383 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.670418 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.670426 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.670441 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.670450 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.772932 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.772979 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.772992 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.773009 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.773021 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.875458 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.875498 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.875509 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.875523 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.875533 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.979633 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.979733 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.979790 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.979818 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4761]: I1201 10:32:35.979882 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.082731 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.082825 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.082876 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.082901 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.082917 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.128037 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.128137 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.128216 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.128361 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.128469 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.128760 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.186451 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.186504 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.186515 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.186534 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.186569 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.290430 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.290477 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.290490 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.290507 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.290520 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.393061 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.393130 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.393142 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.393160 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.393174 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.495527 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.495629 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.495642 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.496032 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.496070 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.599591 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.599658 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.599675 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.599699 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.599716 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.620512 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.620581 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.620593 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.620610 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.620621 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.634250 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.638380 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.638420 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.638430 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.638444 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.638455 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.652736 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.656286 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.656317 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.656325 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.656340 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.656348 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.667142 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.670742 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.670780 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.670788 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.670851 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.670863 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.685010 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.688810 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.688840 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.688849 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.688861 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.688870 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.702542 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e43c0780-f8b7-40cc-82a5-0e835247b9ef\\\",\\\"systemUUID\\\":\\\"ec505933-0668-4f39-8d86-8e4b6f0f3c38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4761]: E1201 10:32:36.702731 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.704334 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.704361 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.704369 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.704383 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.704394 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.807000 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.807056 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.807071 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.807091 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.807106 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.911564 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.911607 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.911615 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.911631 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4761]: I1201 10:32:36.911640 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.014362 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.014430 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.014448 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.014475 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.014497 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.116943 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.116982 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.116992 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.117008 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.117019 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.128339 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:37 crc kubenswrapper[4761]: E1201 10:32:37.128678 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.219922 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.220054 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.220126 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.220164 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.220249 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.323691 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.323739 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.323749 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.323766 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.323778 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.427359 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.427398 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.427408 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.427431 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.427442 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.530039 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.530074 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.530084 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.530098 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.530107 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.632971 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.633025 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.633034 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.633048 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.633057 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.736227 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.736287 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.736304 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.736326 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.736344 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.838772 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.838807 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.838818 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.838854 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.838864 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.959525 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.959586 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.959600 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.959626 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4761]: I1201 10:32:37.959641 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.063104 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.063172 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.063184 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.063200 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.063211 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.128288 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.128419 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:38 crc kubenswrapper[4761]: E1201 10:32:38.128582 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.128603 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:38 crc kubenswrapper[4761]: E1201 10:32:38.128771 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:38 crc kubenswrapper[4761]: E1201 10:32:38.128832 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.165737 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.165793 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.165809 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.165832 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.165849 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.269127 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.269196 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.269214 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.269238 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.269256 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.372984 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.373057 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.373073 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.373097 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.373116 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.476137 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.476176 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.476187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.476202 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.476215 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.578596 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.578650 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.578662 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.578677 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.578689 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.681234 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.681291 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.681312 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.681334 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.681350 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.783637 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.783678 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.783689 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.783704 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.783716 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.886457 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.886528 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.886587 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.886616 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.886634 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.989384 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.989442 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.989460 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.989483 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4761]: I1201 10:32:38.989499 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.091940 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.092032 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.092064 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.092094 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.092118 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.128781 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:39 crc kubenswrapper[4761]: E1201 10:32:39.130048 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.195518 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.195627 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.195653 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.195682 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.195702 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.298792 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.298838 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.298851 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.298866 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.298877 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.402692 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.403667 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.403704 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.403725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.403737 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.494209 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:39 crc kubenswrapper[4761]: E1201 10:32:39.494383 4761 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:39 crc kubenswrapper[4761]: E1201 10:32:39.494468 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs podName:65d0c868-c268-4723-9323-6937c06b4ea9 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.494447404 +0000 UTC m=+162.798206028 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs") pod "network-metrics-daemon-86rp7" (UID: "65d0c868-c268-4723-9323-6937c06b4ea9") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.509930 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.510003 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.510018 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.510038 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.510057 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.613184 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.613492 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.613607 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.613711 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.613804 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.716013 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.716072 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.716089 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.716113 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.716129 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.818723 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.818777 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.818789 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.818807 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.818819 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.921901 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.922186 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.922274 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.922366 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4761]: I1201 10:32:39.922451 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.024846 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.025153 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.025232 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.025326 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.025415 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.127367 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.127488 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:40 crc kubenswrapper[4761]: E1201 10:32:40.127638 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.127712 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:40 crc kubenswrapper[4761]: E1201 10:32:40.127875 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:40 crc kubenswrapper[4761]: E1201 10:32:40.127982 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.129126 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.129229 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.129251 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.129273 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.129333 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.232042 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.232161 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.232187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.232344 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.232367 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.336306 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.336355 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.336369 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.336387 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.336399 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.439894 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.439939 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.439949 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.439969 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.439981 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.542739 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.542785 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.542797 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.542813 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.542825 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.645736 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.645825 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.645839 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.645856 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.645869 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.749599 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.749663 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.749679 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.749707 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.749725 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.852325 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.852367 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.852379 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.852394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.852405 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.954695 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.954732 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.954740 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.954752 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4761]: I1201 10:32:40.954761 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.057110 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.057167 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.057183 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.057206 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.057223 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.128152 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:41 crc kubenswrapper[4761]: E1201 10:32:41.128369 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.159631 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.159707 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.159717 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.159757 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.159771 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.193417 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jbqqz" podStartSLOduration=81.19339038 podStartE2EDuration="1m21.19339038s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.156700122 +0000 UTC m=+100.460458746" watchObservedRunningTime="2025-12-01 10:32:41.19339038 +0000 UTC m=+100.497149044" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.208187 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=12.208159512 podStartE2EDuration="12.208159512s" podCreationTimestamp="2025-12-01 10:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.207918045 +0000 UTC m=+100.511676739" watchObservedRunningTime="2025-12-01 10:32:41.208159512 +0000 UTC m=+100.511918166" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.247054 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwhnl" podStartSLOduration=80.247032543 podStartE2EDuration="1m20.247032543s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.24623506 +0000 UTC m=+100.549993704" watchObservedRunningTime="2025-12-01 10:32:41.247032543 +0000 UTC m=+100.550791187" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.267499 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc 
kubenswrapper[4761]: I1201 10:32:41.267593 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.267614 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.267638 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.267666 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.309724 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zx6x8" podStartSLOduration=81.309695154 podStartE2EDuration="1m21.309695154s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.2913779 +0000 UTC m=+100.595136514" watchObservedRunningTime="2025-12-01 10:32:41.309695154 +0000 UTC m=+100.613453808" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.351655 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8sv24" podStartSLOduration=81.351637942 podStartE2EDuration="1m21.351637942s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 10:32:41.331723283 +0000 UTC m=+100.635481917" watchObservedRunningTime="2025-12-01 10:32:41.351637942 +0000 UTC m=+100.655396566" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.375198 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.375236 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.375246 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.375260 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.375270 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.383859 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=80.383839722 podStartE2EDuration="1m20.383839722s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.383347728 +0000 UTC m=+100.687106372" watchObservedRunningTime="2025-12-01 10:32:41.383839722 +0000 UTC m=+100.687598336" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.397194 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.397169843 podStartE2EDuration="46.397169843s" podCreationTimestamp="2025-12-01 10:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.396312629 +0000 UTC m=+100.700071253" watchObservedRunningTime="2025-12-01 10:32:41.397169843 +0000 UTC m=+100.700928477" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.420969 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podStartSLOduration=81.420947493 podStartE2EDuration="1m21.420947493s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.420941453 +0000 UTC m=+100.724700097" watchObservedRunningTime="2025-12-01 10:32:41.420947493 +0000 UTC m=+100.724706117" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.436349 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nz6qt" 
podStartSLOduration=81.436325482 podStartE2EDuration="1m21.436325482s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.436307742 +0000 UTC m=+100.740066376" watchObservedRunningTime="2025-12-01 10:32:41.436325482 +0000 UTC m=+100.740084106" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.461103 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.46108302 podStartE2EDuration="1m20.46108302s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.460013559 +0000 UTC m=+100.763772183" watchObservedRunningTime="2025-12-01 10:32:41.46108302 +0000 UTC m=+100.764841644" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.477951 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.478231 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.478320 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.478404 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.478511 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.581338 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.581394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.581406 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.581424 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.581438 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.684315 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.684366 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.684377 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.684394 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.684407 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.787214 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.787255 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.787269 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.787288 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.787301 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.889931 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.889990 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.890001 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.890016 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.890028 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.995611 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.995659 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.995671 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.995694 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4761]: I1201 10:32:41.995708 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.098531 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.098633 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.098651 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.098675 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.098695 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.127769 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:42 crc kubenswrapper[4761]: E1201 10:32:42.127907 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.127929 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.127929 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 10:32:42 crc kubenswrapper[4761]: E1201 10:32:42.128240 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 10:32:42 crc kubenswrapper[4761]: E1201 10:32:42.128816 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.129279 4761 scope.go:117] "RemoveContainer" containerID="f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30"
Dec 01 10:32:42 crc kubenswrapper[4761]: E1201 10:32:42.129522 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pllhm_openshift-ovn-kubernetes(463dbf7c-b2d9-4f91-819c-f74a30d5d01b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.200890 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.200929 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.200937 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.200952 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.200961 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.303638 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.303684 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.303695 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.303714 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.303725 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.406057 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.406169 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.406204 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.406236 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.406257 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.509725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.509784 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.509801 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.509824 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.509843 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.613344 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.613404 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.613422 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.613448 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.613464 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.715908 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.715972 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.715988 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.716010 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.716030 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.819179 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.819260 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.819286 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.819318 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.819342 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.923003 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.923091 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.923109 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.923135 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:42 crc kubenswrapper[4761]: I1201 10:32:42.923154 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.025512 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.025586 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.025603 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.025623 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.025638 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.127446 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7"
Dec 01 10:32:43 crc kubenswrapper[4761]: E1201 10:32:43.127930 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.129031 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.129098 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.129117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.129142 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.129161 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.232441 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.232485 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.232497 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.232514 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.232527 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.335614 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.335651 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.335660 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.335677 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.335686 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.438666 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.438727 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.438743 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.438769 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.438791 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.541734 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.541775 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.541783 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.541797 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.541806 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.650118 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.650211 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.650236 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.650299 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.650324 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.753504 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.753834 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.753857 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.753886 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.753905 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.856846 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.856917 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.856941 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.856972 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.856993 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.958886 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.958923 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.958932 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.958945 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:43 crc kubenswrapper[4761]: I1201 10:32:43.958954 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.061740 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.061808 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.061819 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.061835 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.061853 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.128344 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.128436 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.128358 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 10:32:44 crc kubenswrapper[4761]: E1201 10:32:44.128625 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 10:32:44 crc kubenswrapper[4761]: E1201 10:32:44.128722 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 10:32:44 crc kubenswrapper[4761]: E1201 10:32:44.128862 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.164874 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.164917 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.164927 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.164943 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.164956 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.267057 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.267104 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.267116 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.267134 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.267147 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.370460 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.370541 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.370610 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.370642 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.370664 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.473046 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.473101 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.473117 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.473142 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.473159 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.575725 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.575795 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.575818 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.575847 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.575869 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.679019 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.679064 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.679079 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.679099 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.679114 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.782012 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.782058 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.782069 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.782108 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.782119 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.885401 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.885473 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.885486 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.885503 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.885516 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.987871 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.987919 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.987954 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.987973 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:32:44 crc kubenswrapper[4761]: I1201 10:32:44.987984 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.089847 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.089888 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.089899 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.089917 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.089930 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.128349 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:45 crc kubenswrapper[4761]: E1201 10:32:45.128831 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.193232 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.193266 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.193275 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.193289 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.193299 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.295613 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.295641 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.295650 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.295664 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.295673 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.398305 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.398381 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.398403 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.398431 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.398451 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.501668 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.501739 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.501759 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.501783 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.501802 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.604044 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.604113 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.604130 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.604154 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.604170 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.707135 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.707210 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.707233 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.707261 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.707283 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.809972 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.810023 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.810038 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.810057 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.810072 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.912412 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.912464 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.912478 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.912495 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4761]: I1201 10:32:45.912507 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.014977 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.015015 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.015028 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.015046 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.015077 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.118628 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.118704 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.118727 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.118756 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.118774 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.127979 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.128080 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:46 crc kubenswrapper[4761]: E1201 10:32:46.128124 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.128001 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:46 crc kubenswrapper[4761]: E1201 10:32:46.128235 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:46 crc kubenswrapper[4761]: E1201 10:32:46.128457 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.221056 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.221101 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.221112 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.221128 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.221139 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.324351 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.324416 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.324442 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.324469 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.324490 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.428310 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.428404 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.428423 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.428450 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.428467 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.531656 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.531714 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.531733 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.531757 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.531775 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.634113 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.634171 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.634187 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.634211 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.634232 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.738005 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.738070 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.738087 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.738111 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.738128 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.841733 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.841788 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.841799 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.841817 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.841830 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.944875 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.944925 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.944935 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.944952 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4761]: I1201 10:32:46.944963 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.048335 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.048415 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.048438 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.048529 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.048621 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.088173 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.088234 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.088248 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.088266 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.088279 4761 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.128442 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:47 crc kubenswrapper[4761]: E1201 10:32:47.128601 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.160657 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.160577742 podStartE2EDuration="1m27.160577742s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:41.478832047 +0000 UTC m=+100.782590691" watchObservedRunningTime="2025-12-01 10:32:47.160577742 +0000 UTC m=+106.464336396" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.163345 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf"] Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.164067 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.167646 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.168020 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.168195 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.168618 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.292847 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/55520faa-789f-4bd8-b985-0fbb2b569599-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.292929 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55520faa-789f-4bd8-b985-0fbb2b569599-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.293012 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/55520faa-789f-4bd8-b985-0fbb2b569599-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.293052 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55520faa-789f-4bd8-b985-0fbb2b569599-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.293097 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55520faa-789f-4bd8-b985-0fbb2b569599-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: 
\"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.394186 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/55520faa-789f-4bd8-b985-0fbb2b569599-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.394252 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55520faa-789f-4bd8-b985-0fbb2b569599-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.394304 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/55520faa-789f-4bd8-b985-0fbb2b569599-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.394410 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55520faa-789f-4bd8-b985-0fbb2b569599-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.394504 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/55520faa-789f-4bd8-b985-0fbb2b569599-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.394302 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/55520faa-789f-4bd8-b985-0fbb2b569599-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.394853 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/55520faa-789f-4bd8-b985-0fbb2b569599-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.396398 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55520faa-789f-4bd8-b985-0fbb2b569599-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.402029 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55520faa-789f-4bd8-b985-0fbb2b569599-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc 
kubenswrapper[4761]: I1201 10:32:47.426886 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55520faa-789f-4bd8-b985-0fbb2b569599-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g9vnf\" (UID: \"55520faa-789f-4bd8-b985-0fbb2b569599\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.481524 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.749947 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" event={"ID":"55520faa-789f-4bd8-b985-0fbb2b569599","Type":"ContainerStarted","Data":"c6836a9ad21a916d24809ee7a94343ee75c8b718fd860e61d67a7910b98ffad2"} Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.750261 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" event={"ID":"55520faa-789f-4bd8-b985-0fbb2b569599","Type":"ContainerStarted","Data":"971a04b6005dd1e738d8b41d75dff476e9c64afe93df74de795fd010483ceec4"} Dec 01 10:32:47 crc kubenswrapper[4761]: I1201 10:32:47.768581 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g9vnf" podStartSLOduration=87.768537243 podStartE2EDuration="1m27.768537243s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:47.76807289 +0000 UTC m=+107.071831554" watchObservedRunningTime="2025-12-01 10:32:47.768537243 +0000 UTC m=+107.072295867" Dec 01 10:32:48 crc kubenswrapper[4761]: I1201 10:32:48.127983 4761 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:48 crc kubenswrapper[4761]: I1201 10:32:48.128011 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:48 crc kubenswrapper[4761]: E1201 10:32:48.128096 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:48 crc kubenswrapper[4761]: I1201 10:32:48.127995 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:48 crc kubenswrapper[4761]: E1201 10:32:48.128274 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:48 crc kubenswrapper[4761]: E1201 10:32:48.128257 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:49 crc kubenswrapper[4761]: I1201 10:32:49.128140 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:49 crc kubenswrapper[4761]: E1201 10:32:49.128351 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:50 crc kubenswrapper[4761]: I1201 10:32:50.127981 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:50 crc kubenswrapper[4761]: I1201 10:32:50.128040 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:50 crc kubenswrapper[4761]: I1201 10:32:50.128058 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:50 crc kubenswrapper[4761]: E1201 10:32:50.128136 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:50 crc kubenswrapper[4761]: E1201 10:32:50.128233 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:50 crc kubenswrapper[4761]: E1201 10:32:50.128344 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:51 crc kubenswrapper[4761]: I1201 10:32:51.128257 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:51 crc kubenswrapper[4761]: E1201 10:32:51.130785 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:52 crc kubenswrapper[4761]: I1201 10:32:52.127696 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:52 crc kubenswrapper[4761]: I1201 10:32:52.127797 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:52 crc kubenswrapper[4761]: E1201 10:32:52.127826 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:52 crc kubenswrapper[4761]: E1201 10:32:52.127995 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:52 crc kubenswrapper[4761]: I1201 10:32:52.127797 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:52 crc kubenswrapper[4761]: E1201 10:32:52.128178 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:53 crc kubenswrapper[4761]: I1201 10:32:53.127599 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:53 crc kubenswrapper[4761]: E1201 10:32:53.127848 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:54 crc kubenswrapper[4761]: I1201 10:32:54.128061 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:54 crc kubenswrapper[4761]: I1201 10:32:54.128127 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:54 crc kubenswrapper[4761]: E1201 10:32:54.128191 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:54 crc kubenswrapper[4761]: I1201 10:32:54.128072 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:54 crc kubenswrapper[4761]: E1201 10:32:54.128279 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:54 crc kubenswrapper[4761]: E1201 10:32:54.128408 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:55 crc kubenswrapper[4761]: I1201 10:32:55.128521 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:55 crc kubenswrapper[4761]: E1201 10:32:55.128785 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:55 crc kubenswrapper[4761]: I1201 10:32:55.777023 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/1.log" Dec 01 10:32:55 crc kubenswrapper[4761]: I1201 10:32:55.778200 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/0.log" Dec 01 10:32:55 crc kubenswrapper[4761]: I1201 10:32:55.778248 4761 generic.go:334] "Generic (PLEG): container finished" podID="7a9149d7-77b0-4df1-8d1a-5a94ef00463a" containerID="4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128" exitCode=1 Dec 01 10:32:55 crc kubenswrapper[4761]: I1201 10:32:55.778281 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nz6qt" event={"ID":"7a9149d7-77b0-4df1-8d1a-5a94ef00463a","Type":"ContainerDied","Data":"4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128"} Dec 01 10:32:55 crc kubenswrapper[4761]: I1201 10:32:55.778316 4761 scope.go:117] "RemoveContainer" containerID="5ac495da433f4e763cc88d421a2944df550ec5eb4effdd695a2fb6a269253dfb" Dec 01 10:32:55 crc kubenswrapper[4761]: I1201 10:32:55.778832 4761 scope.go:117] "RemoveContainer" containerID="4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128" Dec 01 10:32:55 crc kubenswrapper[4761]: E1201 10:32:55.779013 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nz6qt_openshift-multus(7a9149d7-77b0-4df1-8d1a-5a94ef00463a)\"" pod="openshift-multus/multus-nz6qt" podUID="7a9149d7-77b0-4df1-8d1a-5a94ef00463a" Dec 01 10:32:56 crc kubenswrapper[4761]: I1201 10:32:56.127943 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:56 crc kubenswrapper[4761]: I1201 10:32:56.127995 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:56 crc kubenswrapper[4761]: I1201 10:32:56.127946 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:56 crc kubenswrapper[4761]: E1201 10:32:56.128222 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:56 crc kubenswrapper[4761]: E1201 10:32:56.128351 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:56 crc kubenswrapper[4761]: E1201 10:32:56.128530 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:56 crc kubenswrapper[4761]: I1201 10:32:56.784960 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/1.log" Dec 01 10:32:57 crc kubenswrapper[4761]: I1201 10:32:57.127583 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:57 crc kubenswrapper[4761]: E1201 10:32:57.127716 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:57 crc kubenswrapper[4761]: I1201 10:32:57.128776 4761 scope.go:117] "RemoveContainer" containerID="f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30" Dec 01 10:32:57 crc kubenswrapper[4761]: I1201 10:32:57.790757 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/3.log" Dec 01 10:32:57 crc kubenswrapper[4761]: I1201 10:32:57.793113 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerStarted","Data":"66d185ea008facfbf66c4693ed2abbb7d581c51a627a47074fc8cc3a1292b153"} Dec 01 10:32:57 crc kubenswrapper[4761]: I1201 10:32:57.793576 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:32:57 crc kubenswrapper[4761]: I1201 10:32:57.818774 4761 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podStartSLOduration=96.818752435 podStartE2EDuration="1m36.818752435s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:32:57.817899971 +0000 UTC m=+117.121658615" watchObservedRunningTime="2025-12-01 10:32:57.818752435 +0000 UTC m=+117.122511079" Dec 01 10:32:57 crc kubenswrapper[4761]: I1201 10:32:57.959124 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-86rp7"] Dec 01 10:32:57 crc kubenswrapper[4761]: I1201 10:32:57.959228 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:32:57 crc kubenswrapper[4761]: E1201 10:32:57.959307 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:32:58 crc kubenswrapper[4761]: I1201 10:32:58.127409 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:58 crc kubenswrapper[4761]: I1201 10:32:58.127434 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:58 crc kubenswrapper[4761]: E1201 10:32:58.127569 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:58 crc kubenswrapper[4761]: I1201 10:32:58.127622 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:58 crc kubenswrapper[4761]: E1201 10:32:58.127793 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:58 crc kubenswrapper[4761]: E1201 10:32:58.127993 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:00 crc kubenswrapper[4761]: I1201 10:33:00.128587 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:00 crc kubenswrapper[4761]: I1201 10:33:00.128705 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:00 crc kubenswrapper[4761]: E1201 10:33:00.128810 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:00 crc kubenswrapper[4761]: I1201 10:33:00.128633 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:00 crc kubenswrapper[4761]: E1201 10:33:00.128934 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:00 crc kubenswrapper[4761]: I1201 10:33:00.128648 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:00 crc kubenswrapper[4761]: E1201 10:33:00.129086 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:00 crc kubenswrapper[4761]: E1201 10:33:00.129198 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:01 crc kubenswrapper[4761]: E1201 10:33:01.174976 4761 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 10:33:01 crc kubenswrapper[4761]: E1201 10:33:01.361497 4761 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:33:02 crc kubenswrapper[4761]: I1201 10:33:02.127509 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:02 crc kubenswrapper[4761]: I1201 10:33:02.127573 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:02 crc kubenswrapper[4761]: I1201 10:33:02.127622 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:02 crc kubenswrapper[4761]: I1201 10:33:02.127678 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:02 crc kubenswrapper[4761]: E1201 10:33:02.127792 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:02 crc kubenswrapper[4761]: E1201 10:33:02.127911 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:02 crc kubenswrapper[4761]: E1201 10:33:02.127943 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:02 crc kubenswrapper[4761]: E1201 10:33:02.128000 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:04 crc kubenswrapper[4761]: I1201 10:33:04.128341 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:04 crc kubenswrapper[4761]: I1201 10:33:04.128404 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:04 crc kubenswrapper[4761]: I1201 10:33:04.128353 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:04 crc kubenswrapper[4761]: E1201 10:33:04.128573 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:04 crc kubenswrapper[4761]: E1201 10:33:04.128615 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:04 crc kubenswrapper[4761]: E1201 10:33:04.128680 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:04 crc kubenswrapper[4761]: I1201 10:33:04.130092 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:04 crc kubenswrapper[4761]: E1201 10:33:04.130282 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:06 crc kubenswrapper[4761]: I1201 10:33:06.128285 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:06 crc kubenswrapper[4761]: I1201 10:33:06.128313 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:06 crc kubenswrapper[4761]: I1201 10:33:06.128319 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:06 crc kubenswrapper[4761]: I1201 10:33:06.128282 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:06 crc kubenswrapper[4761]: E1201 10:33:06.128404 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:06 crc kubenswrapper[4761]: E1201 10:33:06.128790 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:06 crc kubenswrapper[4761]: E1201 10:33:06.128901 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:06 crc kubenswrapper[4761]: E1201 10:33:06.129024 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:06 crc kubenswrapper[4761]: E1201 10:33:06.363180 4761 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:33:08 crc kubenswrapper[4761]: I1201 10:33:08.127870 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:08 crc kubenswrapper[4761]: I1201 10:33:08.127958 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:08 crc kubenswrapper[4761]: I1201 10:33:08.127871 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:08 crc kubenswrapper[4761]: I1201 10:33:08.127911 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:08 crc kubenswrapper[4761]: E1201 10:33:08.128076 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:08 crc kubenswrapper[4761]: E1201 10:33:08.128190 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:08 crc kubenswrapper[4761]: E1201 10:33:08.128277 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:08 crc kubenswrapper[4761]: E1201 10:33:08.128335 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:10 crc kubenswrapper[4761]: I1201 10:33:10.128540 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:10 crc kubenswrapper[4761]: I1201 10:33:10.128703 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:10 crc kubenswrapper[4761]: I1201 10:33:10.128616 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:10 crc kubenswrapper[4761]: E1201 10:33:10.128862 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:10 crc kubenswrapper[4761]: E1201 10:33:10.129030 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:10 crc kubenswrapper[4761]: I1201 10:33:10.129166 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:10 crc kubenswrapper[4761]: E1201 10:33:10.129200 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:10 crc kubenswrapper[4761]: E1201 10:33:10.129307 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:11 crc kubenswrapper[4761]: I1201 10:33:11.130451 4761 scope.go:117] "RemoveContainer" containerID="4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128" Dec 01 10:33:11 crc kubenswrapper[4761]: E1201 10:33:11.363852 4761 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:33:12 crc kubenswrapper[4761]: I1201 10:33:12.128085 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:12 crc kubenswrapper[4761]: I1201 10:33:12.128167 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:12 crc kubenswrapper[4761]: I1201 10:33:12.128236 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:12 crc kubenswrapper[4761]: E1201 10:33:12.128376 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:12 crc kubenswrapper[4761]: I1201 10:33:12.128500 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:12 crc kubenswrapper[4761]: E1201 10:33:12.128723 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:12 crc kubenswrapper[4761]: E1201 10:33:12.128824 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:12 crc kubenswrapper[4761]: E1201 10:33:12.128908 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:12 crc kubenswrapper[4761]: I1201 10:33:12.846799 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/1.log" Dec 01 10:33:12 crc kubenswrapper[4761]: I1201 10:33:12.846889 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nz6qt" event={"ID":"7a9149d7-77b0-4df1-8d1a-5a94ef00463a","Type":"ContainerStarted","Data":"5d5ba0b4c00a761700fbb26c07c77c1fefe4b5b54df3f83e70592beb830196eb"} Dec 01 10:33:14 crc kubenswrapper[4761]: I1201 10:33:14.128422 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:14 crc kubenswrapper[4761]: I1201 10:33:14.128475 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:14 crc kubenswrapper[4761]: I1201 10:33:14.128498 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:14 crc kubenswrapper[4761]: I1201 10:33:14.128431 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:14 crc kubenswrapper[4761]: E1201 10:33:14.128721 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:14 crc kubenswrapper[4761]: E1201 10:33:14.128591 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:14 crc kubenswrapper[4761]: E1201 10:33:14.128879 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:14 crc kubenswrapper[4761]: E1201 10:33:14.129106 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:16 crc kubenswrapper[4761]: I1201 10:33:16.128331 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:16 crc kubenswrapper[4761]: I1201 10:33:16.128398 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:16 crc kubenswrapper[4761]: I1201 10:33:16.128404 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:16 crc kubenswrapper[4761]: I1201 10:33:16.128348 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:16 crc kubenswrapper[4761]: E1201 10:33:16.128537 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-86rp7" podUID="65d0c868-c268-4723-9323-6937c06b4ea9" Dec 01 10:33:16 crc kubenswrapper[4761]: E1201 10:33:16.128763 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:16 crc kubenswrapper[4761]: E1201 10:33:16.128991 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:16 crc kubenswrapper[4761]: E1201 10:33:16.129105 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.750325 4761 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.789769 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tfh9j"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.790607 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.791045 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.791738 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.792491 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xzg25"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.793010 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.793983 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.794539 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.794737 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v7r4l"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.795257 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.795354 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-fqctr"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.795661 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-fqctr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.796620 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.796816 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.796950 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.797145 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-bz89h"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.797611 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.798017 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.798674 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.799654 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.800372 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.801161 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.802403 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.802774 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.803168 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.803748 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.803806 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.803892 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.803971 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804063 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804126 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 
10:33:17.804207 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804281 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804325 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804061 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804331 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.803898 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804475 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804578 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804212 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.804678 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.805391 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: 
I1201 10:33:17.805978 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.806282 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.806449 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.806623 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.806799 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.806927 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.807056 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.807144 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.807175 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.807256 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.807181 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.807494 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r4655"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.808026 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.809255 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82k6m"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.810128 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.810805 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.811691 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.811993 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wqskh"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.812547 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.814180 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.814902 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.815461 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.815607 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.815688 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.816163 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.816428 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5s745"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.816477 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 10:33:17 crc 
kubenswrapper[4761]: I1201 10:33:17.816790 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.816892 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.816794 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pgskl"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.817582 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.826516 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8nfjc"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.830356 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.830718 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.830856 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.830908 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.830849 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.830755 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.856708 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.857174 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.857450 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.860516 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.861093 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 10:33:17 crc 
kubenswrapper[4761]: I1201 10:33:17.861297 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v7r4l"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.861325 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.861918 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862167 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862178 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1328387a-a550-49b5-92ce-7019cb401bfb-machine-approver-tls\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862210 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5twpb\" (UniqueName: \"kubernetes.io/projected/78cf6292-f923-40a8-9f4e-183d70e31a7f-kube-api-access-5twpb\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862233 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1328387a-a550-49b5-92ce-7019cb401bfb-auth-proxy-config\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862256 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78cf6292-f923-40a8-9f4e-183d70e31a7f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862278 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862298 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06629698-b5a8-41a6-b94b-771abc920e20-audit-dir\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862321 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1328387a-a550-49b5-92ce-7019cb401bfb-config\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc 
kubenswrapper[4761]: I1201 10:33:17.862473 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-audit-policies\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862534 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/78cf6292-f923-40a8-9f4e-183d70e31a7f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862589 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-serving-cert\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862659 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tggt\" (UniqueName: \"kubernetes.io/projected/1328387a-a550-49b5-92ce-7019cb401bfb-kube-api-access-5tggt\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862686 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/78cf6292-f923-40a8-9f4e-183d70e31a7f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862714 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-etcd-client\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862730 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-encryption-config\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862757 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.862771 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rx6\" (UniqueName: \"kubernetes.io/projected/06629698-b5a8-41a6-b94b-771abc920e20-kube-api-access-88rx6\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: 
I1201 10:33:17.864649 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.865113 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.865248 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.865382 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.865492 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.865637 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.872254 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.872657 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.872664 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.874320 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.875863 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.876179 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.876390 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.877729 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.877911 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.883353 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.883608 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.883769 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.884002 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.884148 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.884299 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.884397 4761 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.884473 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.884782 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.884953 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885006 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885137 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885170 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885241 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885322 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885339 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885472 4761 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885714 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.885140 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.887437 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.887540 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.888764 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.888957 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xzg25"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.888970 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.889061 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.889138 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.889342 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.889716 4761 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.889841 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.890080 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.890682 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lpmsm"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.891002 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.891276 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.891332 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.891667 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.896244 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.896510 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.896896 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.898924 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.899417 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.901264 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.902381 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.913000 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.914095 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.915578 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-827fw"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.921669 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.922463 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.924511 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.924975 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.933262 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.933780 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.934116 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.934499 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.937757 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fqctr"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.939645 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.940266 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.940502 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.941127 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.941891 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.943746 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2xx98"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.944019 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.944818 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.945570 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.945675 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-std2v"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.945868 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.946717 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.946850 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.947670 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.948014 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-472l6"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.948834 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.949484 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-926pr"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.950015 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.953017 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.956485 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.957720 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.957845 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.958372 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.958902 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.959365 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.960166 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.960522 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.961303 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.961918 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963248 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-etcd-client\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963286 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-encryption-config\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963345 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88rx6\" (UniqueName: \"kubernetes.io/projected/06629698-b5a8-41a6-b94b-771abc920e20-kube-api-access-88rx6\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963378 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963447 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5twpb\" (UniqueName: \"kubernetes.io/projected/78cf6292-f923-40a8-9f4e-183d70e31a7f-kube-api-access-5twpb\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963493 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1328387a-a550-49b5-92ce-7019cb401bfb-machine-approver-tls\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963529 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963572 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06629698-b5a8-41a6-b94b-771abc920e20-audit-dir\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963590 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/1328387a-a550-49b5-92ce-7019cb401bfb-auth-proxy-config\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963606 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78cf6292-f923-40a8-9f4e-183d70e31a7f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963622 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1328387a-a550-49b5-92ce-7019cb401bfb-config\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963674 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-audit-policies\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963697 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/78cf6292-f923-40a8-9f4e-183d70e31a7f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc 
kubenswrapper[4761]: I1201 10:33:17.963734 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-serving-cert\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963764 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tggt\" (UniqueName: \"kubernetes.io/projected/1328387a-a550-49b5-92ce-7019cb401bfb-kube-api-access-5tggt\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.963814 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78cf6292-f923-40a8-9f4e-183d70e31a7f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.964680 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-q54z4"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.965276 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.965612 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06629698-b5a8-41a6-b94b-771abc920e20-audit-dir\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.967356 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1328387a-a550-49b5-92ce-7019cb401bfb-config\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.967382 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-audit-policies\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.967583 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1328387a-a550-49b5-92ce-7019cb401bfb-auth-proxy-config\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.967895 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: 
\"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.968617 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06629698-b5a8-41a6-b94b-771abc920e20-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.969645 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78cf6292-f923-40a8-9f4e-183d70e31a7f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.970441 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-etcd-client\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.971454 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-encryption-config\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.972861 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06629698-b5a8-41a6-b94b-771abc920e20-serving-cert\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: 
\"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.973882 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/78cf6292-f923-40a8-9f4e-183d70e31a7f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.975886 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1328387a-a550-49b5-92ce-7019cb401bfb-machine-approver-tls\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.976770 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wqskh"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.979610 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.979635 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8nfjc"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.980266 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bz89h"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.981267 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tfh9j"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.982403 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-pgskl"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.983637 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r4655"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.984493 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.985999 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.986916 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hxb77"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.987388 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.988918 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.990773 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.991865 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.993146 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.995649 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr"] Dec 01 10:33:17 crc kubenswrapper[4761]: I1201 10:33:17.997668 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.001983 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.007368 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5s745"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.013777 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.015956 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-827fw"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.017237 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.018837 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.019614 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.020546 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zlgvt"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.021409 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zlgvt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.021655 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-npj9f"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.022802 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.022891 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.024105 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.025160 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-std2v"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.026717 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-472l6"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.027873 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-npj9f"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.029050 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.030265 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2xx98"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.031419 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zlgvt"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.032929 4761 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.033053 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-926pr"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.034437 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.035619 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-q54z4"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.036742 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82k6m"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.037880 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xf5wg"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.038665 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.039134 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xf5wg"] Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.052855 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.072721 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.093184 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.113348 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.127362 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.127383 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.127431 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.127649 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.134210 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.153162 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.173778 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.193790 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.212719 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.232884 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.273097 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.293827 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.315157 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.334047 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 
10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.354620 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.373811 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.394977 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.414343 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.434151 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.454518 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.475206 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.494388 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.514340 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.534179 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 
01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.558411 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.574319 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.594239 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.614420 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.633328 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.661776 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.673665 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.704031 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.713685 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.734379 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 10:33:18 crc 
kubenswrapper[4761]: I1201 10:33:18.753941 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.773412 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.794370 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.814840 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.833969 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.854323 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.874803 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.893759 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.914374 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.932774 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 10:33:18 
crc kubenswrapper[4761]: I1201 10:33:18.952316 4761 request.go:700] Waited for 1.006159883s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.954370 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.974210 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 10:33:18 crc kubenswrapper[4761]: I1201 10:33:18.993461 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.035651 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.044192 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.055092 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.074267 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.094090 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.114032 4761 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.134585 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.155641 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.174297 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.195909 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.213393 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.234319 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.253598 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.273410 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.293444 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.313121 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 
10:33:19.333197 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.354661 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.373627 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.394397 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.415236 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.434821 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.454465 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.475761 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.495037 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.513142 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.533901 4761 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.571791 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78cf6292-f923-40a8-9f4e-183d70e31a7f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.574497 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.594384 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.638075 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5twpb\" (UniqueName: \"kubernetes.io/projected/78cf6292-f923-40a8-9f4e-183d70e31a7f-kube-api-access-5twpb\") pod \"cluster-image-registry-operator-dc59b4c8b-7gxtn\" (UID: \"78cf6292-f923-40a8-9f4e-183d70e31a7f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.658531 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88rx6\" (UniqueName: \"kubernetes.io/projected/06629698-b5a8-41a6-b94b-771abc920e20-kube-api-access-88rx6\") pod \"apiserver-7bbb656c7d-7mkxr\" (UID: \"06629698-b5a8-41a6-b94b-771abc920e20\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.671870 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tggt\" (UniqueName: 
\"kubernetes.io/projected/1328387a-a550-49b5-92ce-7019cb401bfb-kube-api-access-5tggt\") pod \"machine-approver-56656f9798-c9tst\" (UID: \"1328387a-a550-49b5-92ce-7019cb401bfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.682705 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-tls\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.682763 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdcz\" (UniqueName: \"kubernetes.io/projected/0de6067f-4bc2-4265-bb7f-e595f6060033-kube-api-access-kvdcz\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.682801 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.682833 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5085aee7-8987-489e-86af-3c11f1a6618d-images\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.682865 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-client\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.682894 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-trusted-ca-bundle\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.682964 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trssg\" (UniqueName: \"kubernetes.io/projected/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-kube-api-access-trssg\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683029 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-config\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683078 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fkhn4\" (UniqueName: \"kubernetes.io/projected/906c8f55-3191-4b35-a7d2-80fd512d5c34-kube-api-access-fkhn4\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683135 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmjt5\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-kube-api-access-kmjt5\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683166 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-config\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683196 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-trusted-ca\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683240 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: 
\"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683284 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-config\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683321 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-serving-cert\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683367 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjhl8\" (UniqueName: \"kubernetes.io/projected/5085aee7-8987-489e-86af-3c11f1a6618d-kube-api-access-wjhl8\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683385 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-client-ca\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683401 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683419 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-service-ca\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683436 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-certificates\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683450 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5615f9d-052a-4910-8050-d39d2d9dde06-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683463 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de6067f-4bc2-4265-bb7f-e595f6060033-console-oauth-config\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc 
kubenswrapper[4761]: I1201 10:33:19.683479 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc2dk\" (UniqueName: \"kubernetes.io/projected/1aa5c7a7-c270-4c62-b054-88a85fbfc8b9-kube-api-access-wc2dk\") pod \"dns-operator-744455d44c-8nfjc\" (UID: \"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683499 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/873ee65e-5320-4949-8caa-893b41061408-audit-dir\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683517 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de6067f-4bc2-4265-bb7f-e595f6060033-console-serving-cert\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683533 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-config\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683586 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-error\") 
pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683601 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3454eba8-593e-4647-8b4b-71e0f432ffeb-audit-dir\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683619 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7wbb\" (UniqueName: \"kubernetes.io/projected/33f5c4fc-08a4-4683-ab53-e20612b27d02-kube-api-access-g7wbb\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683635 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5085aee7-8987-489e-86af-3c11f1a6618d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683651 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-client-ca\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683664 
4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lb99\" (UniqueName: \"kubernetes.io/projected/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-kube-api-access-9lb99\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683690 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-audit-policies\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683716 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-encryption-config\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683735 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1aa5c7a7-c270-4c62-b054-88a85fbfc8b9-metrics-tls\") pod \"dns-operator-744455d44c-8nfjc\" (UID: \"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683767 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-trusted-ca\") pod \"image-registry-697d97f7c8-5s745\" (UID: 
\"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683785 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-bound-sa-token\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683804 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kct62\" (UniqueName: \"kubernetes.io/projected/3454eba8-593e-4647-8b4b-71e0f432ffeb-kube-api-access-kct62\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683836 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5085aee7-8987-489e-86af-3c11f1a6618d-config\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683852 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683872 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683890 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-serving-cert\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683904 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-oauth-serving-cert\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683920 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683935 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e7458e-68c1-4b57-a6f5-43eed3453e64-serving-cert\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683949 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-etcd-client\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683965 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f5c4fc-08a4-4683-ab53-e20612b27d02-serving-cert\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.683991 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5615f9d-052a-4910-8050-d39d2d9dde06-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684005 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-ca\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684019 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-service-ca\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684036 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52d03758-6fb1-4040-ae86-d2a89d6cc88f-serving-cert\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684050 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4wzb\" (UniqueName: \"kubernetes.io/projected/52d03758-6fb1-4040-ae86-d2a89d6cc88f-kube-api-access-r4wzb\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684064 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684077 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-console-config\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc 
kubenswrapper[4761]: I1201 10:33:19.684091 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684107 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkd7v\" (UniqueName: \"kubernetes.io/projected/873ee65e-5320-4949-8caa-893b41061408-kube-api-access-fkd7v\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684123 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-service-ca-bundle\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684140 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-config\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684155 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzgtb\" (UniqueName: 
\"kubernetes.io/projected/7713e7f9-1a0c-448b-9814-c143fdd040ec-kube-api-access-lzgtb\") pod \"downloads-7954f5f757-fqctr\" (UID: \"7713e7f9-1a0c-448b-9814-c143fdd040ec\") " pod="openshift-console/downloads-7954f5f757-fqctr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684171 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqn4p\" (UniqueName: \"kubernetes.io/projected/e735de1a-2c56-45b3-b091-33ef92a3b119-kube-api-access-lqn4p\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684189 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684210 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/906c8f55-3191-4b35-a7d2-80fd512d5c34-serving-cert\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684234 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-config\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684250 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85899\" (UniqueName: \"kubernetes.io/projected/f2e7458e-68c1-4b57-a6f5-43eed3453e64-kube-api-access-85899\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684265 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684281 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684299 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684314 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/906c8f55-3191-4b35-a7d2-80fd512d5c34-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684329 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684345 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684360 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3454eba8-593e-4647-8b4b-71e0f432ffeb-node-pullsecrets\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684374 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-image-import-ca\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684390 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e735de1a-2c56-45b3-b091-33ef92a3b119-serving-cert\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684414 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684430 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-etcd-serving-ca\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684448 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" 
Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.684461 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-audit\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: E1201 10:33:19.684941 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.184927867 +0000 UTC m=+139.488686491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.693341 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.714099 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.733730 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.754083 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 10:33:19 
crc kubenswrapper[4761]: I1201 10:33:19.755316 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.774402 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 10:33:19 crc kubenswrapper[4761]: W1201 10:33:19.775194 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1328387a_a550_49b5_92ce_7019cb401bfb.slice/crio-29c47280364f14212928da11be065962298e105a3e9e0aa13bfd605e99205d06 WatchSource:0}: Error finding container 29c47280364f14212928da11be065962298e105a3e9e0aa13bfd605e99205d06: Status 404 returned error can't find the container with id 29c47280364f14212928da11be065962298e105a3e9e0aa13bfd605e99205d06 Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.785687 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.785819 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.785846 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/0b62db57-21d8-498f-9d27-8030bc510076-images\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.785868 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcjqm\" (UniqueName: \"kubernetes.io/projected/8efe62a3-ec31-4144-8d34-150502a96362-kube-api-access-xcjqm\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.785895 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de6067f-4bc2-4265-bb7f-e595f6060033-console-serving-cert\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: E1201 10:33:19.785953 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.285921003 +0000 UTC m=+139.589679667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786036 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3454eba8-593e-4647-8b4b-71e0f432ffeb-audit-dir\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786090 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82353b64-ab8c-431e-8732-ba585bd9cc95-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786129 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7wbb\" (UniqueName: \"kubernetes.io/projected/33f5c4fc-08a4-4683-ab53-e20612b27d02-kube-api-access-g7wbb\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786161 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/00283b92-e322-4923-8631-2f77c33b8993-certs\") pod \"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786196 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1aa5c7a7-c270-4c62-b054-88a85fbfc8b9-metrics-tls\") pod \"dns-operator-744455d44c-8nfjc\" (UID: \"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786229 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-audit-policies\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786262 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-bound-sa-token\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786295 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kct62\" (UniqueName: \"kubernetes.io/projected/3454eba8-593e-4647-8b4b-71e0f432ffeb-kube-api-access-kct62\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786328 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-config\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786363 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbtr\" (UniqueName: \"kubernetes.io/projected/10e2d759-27e8-4e8f-8d6b-86817b091df5-kube-api-access-hdbtr\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786393 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3ce8fad-1931-4034-96f9-6b9750665a36-webhook-cert\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786441 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-metrics-certs\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786478 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a2762223-c499-429c-814a-00ead7b447d8-signing-key\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" 
Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786487 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3454eba8-593e-4647-8b4b-71e0f432ffeb-audit-dir\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786511 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-registration-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786580 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l86t6\" (UniqueName: \"kubernetes.io/projected/0a304a57-2fa4-477c-8d57-4f411e4f8790-kube-api-access-l86t6\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786619 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-serving-cert\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786652 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxfpg\" (UniqueName: \"kubernetes.io/projected/00283b92-e322-4923-8631-2f77c33b8993-kube-api-access-wxfpg\") pod 
\"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786687 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786737 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e7458e-68c1-4b57-a6f5-43eed3453e64-serving-cert\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786794 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-etcd-client\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786835 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjf9m\" (UniqueName: \"kubernetes.io/projected/a2762223-c499-429c-814a-00ead7b447d8-kube-api-access-rjf9m\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786868 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-default-certificate\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.786917 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-service-ca-bundle\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.787121 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.787951 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-audit-policies\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788165 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnctx\" (UniqueName: \"kubernetes.io/projected/7886c492-0b69-4cb1-aef7-08e7e482bc6a-kube-api-access-hnctx\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788237 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-ca\") pod 
\"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788295 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-service-ca\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788305 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788328 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788367 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-console-config\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788452 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788481 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/00283b92-e322-4923-8631-2f77c33b8993-node-bootstrap-token\") pod \"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788506 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddd9b688-a86b-4d31-b3f6-eb674a12d438-cert\") pod \"ingress-canary-zlgvt\" (UID: \"ddd9b688-a86b-4d31-b3f6-eb674a12d438\") " pod="openshift-ingress-canary/ingress-canary-zlgvt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.788966 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5615f9d-052a-4910-8050-d39d2d9dde06-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789046 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8efe62a3-ec31-4144-8d34-150502a96362-profile-collector-cert\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789150 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/436b751c-ff5f-4b20-b63c-9960d2bebfb5-config\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789262 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkd7v\" (UniqueName: \"kubernetes.io/projected/873ee65e-5320-4949-8caa-893b41061408-kube-api-access-fkd7v\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789290 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-stats-auth\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789352 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-config\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789424 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzgtb\" (UniqueName: \"kubernetes.io/projected/7713e7f9-1a0c-448b-9814-c143fdd040ec-kube-api-access-lzgtb\") pod \"downloads-7954f5f757-fqctr\" (UID: \"7713e7f9-1a0c-448b-9814-c143fdd040ec\") " 
pod="openshift-console/downloads-7954f5f757-fqctr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789472 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flrxm\" (UniqueName: \"kubernetes.io/projected/e3ce8fad-1931-4034-96f9-6b9750665a36-kube-api-access-flrxm\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789499 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789583 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c36a280-252a-48bd-a64d-be969429a43d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: \"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789597 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-service-ca\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789614 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0a304a57-2fa4-477c-8d57-4f411e4f8790-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789530 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-ca\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789645 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh4th\" (UniqueName: \"kubernetes.io/projected/0373f01a-1b29-45f1-a72b-f96dbfb5e359-kube-api-access-nh4th\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789860 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85899\" (UniqueName: \"kubernetes.io/projected/f2e7458e-68c1-4b57-a6f5-43eed3453e64-kube-api-access-85899\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789938 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:19 crc 
kubenswrapper[4761]: I1201 10:33:19.789962 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5615f9d-052a-4910-8050-d39d2d9dde06-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.789966 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/906c8f55-3191-4b35-a7d2-80fd512d5c34-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790116 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790168 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e3ce8fad-1931-4034-96f9-6b9750665a36-tmpfs\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790219 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790349 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3454eba8-593e-4647-8b4b-71e0f432ffeb-node-pullsecrets\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790403 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-config\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790447 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/906c8f55-3191-4b35-a7d2-80fd512d5c34-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790642 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790664 4761 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.790994 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3454eba8-593e-4647-8b4b-71e0f432ffeb-node-pullsecrets\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.791043 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.791112 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e7458e-68c1-4b57-a6f5-43eed3453e64-serving-cert\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.791444 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-console-config\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.791582 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-image-import-ca\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.791732 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-metrics-tls\") pod \"dns-default-xf5wg\" (UID: \"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.791870 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-etcd-serving-ca\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.791921 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbrll\" (UniqueName: \"kubernetes.io/projected/49c0687d-9489-4429-9ef2-09f82f7df268-kube-api-access-gbrll\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.791967 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e456cc-7c02-479e-a278-af630a5dfd6f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792014 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792059 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b62db57-21d8-498f-9d27-8030bc510076-auth-proxy-config\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792110 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9161413-554f-4d53-bc23-efd48ff91a94-config-volume\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792153 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-mountpoint-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792196 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-tls\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792222 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792241 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-plugins-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792259 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-client\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792274 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-trusted-ca-bundle\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792292 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxl5g\" (UniqueName: \"kubernetes.io/projected/6932484c-2cc4-42a6-816f-c368946e0a29-kube-api-access-fxl5g\") pod \"package-server-manager-789f6589d5-852hr\" (UID: \"6932484c-2cc4-42a6-816f-c368946e0a29\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792313 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0373f01a-1b29-45f1-a72b-f96dbfb5e359-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792330 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9ddp\" (UniqueName: \"kubernetes.io/projected/1c36a280-252a-48bd-a64d-be969429a43d-kube-api-access-q9ddp\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: \"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792369 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-config\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792385 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792400 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-config\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792415 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-serving-cert\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792419 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-etcd-client\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792432 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-client-ca\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792530 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-service-ca\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792607 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65e456cc-7c02-479e-a278-af630a5dfd6f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792648 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-certificates\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792687 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc2dk\" (UniqueName: \"kubernetes.io/projected/1aa5c7a7-c270-4c62-b054-88a85fbfc8b9-kube-api-access-wc2dk\") pod \"dns-operator-744455d44c-8nfjc\" (UID: \"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792724 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3ce8fad-1931-4034-96f9-6b9750665a36-apiservice-cert\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc 
kubenswrapper[4761]: I1201 10:33:19.792760 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/873ee65e-5320-4949-8caa-893b41061408-audit-dir\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792792 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-config\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792828 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a304a57-2fa4-477c-8d57-4f411e4f8790-proxy-tls\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792862 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29t2\" (UniqueName: \"kubernetes.io/projected/0b62db57-21d8-498f-9d27-8030bc510076-kube-api-access-l29t2\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792898 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792939 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5085aee7-8987-489e-86af-3c11f1a6618d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.792972 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6v8\" (UniqueName: \"kubernetes.io/projected/4a70f5c2-aba5-46bb-a96b-da503d30e66e-kube-api-access-np6v8\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793030 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-client-ca\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793064 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lb99\" (UniqueName: \"kubernetes.io/projected/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-kube-api-access-9lb99\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc 
kubenswrapper[4761]: I1201 10:33:19.793085 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-client-ca\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793097 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-encryption-config\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793129 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9161413-554f-4d53-bc23-efd48ff91a94-secret-volume\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793163 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-trusted-ca\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793180 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-image-import-ca\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " 
pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793195 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5085aee7-8987-489e-86af-3c11f1a6618d-config\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793314 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793354 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b62db57-21d8-498f-9d27-8030bc510076-proxy-tls\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793395 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793476 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-oauth-serving-cert\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793515 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793533 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-etcd-serving-ca\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793583 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/436b751c-ff5f-4b20-b63c-9960d2bebfb5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793642 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793687 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gswnm\" (UniqueName: 
\"kubernetes.io/projected/49f94e97-89ed-41ca-b0c1-620d9e69ae81-kube-api-access-gswnm\") pod \"control-plane-machine-set-operator-78cbb6b69f-lwd6m\" (UID: \"49f94e97-89ed-41ca-b0c1-620d9e69ae81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793726 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793767 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1827035-d23f-4436-96ee-f363b9ea9022-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xvpkl\" (UID: \"f1827035-d23f-4436-96ee-f363b9ea9022\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793846 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6932484c-2cc4-42a6-816f-c368946e0a29-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-852hr\" (UID: \"6932484c-2cc4-42a6-816f-c368946e0a29\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793920 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/49f94e97-89ed-41ca-b0c1-620d9e69ae81-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lwd6m\" (UID: \"49f94e97-89ed-41ca-b0c1-620d9e69ae81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.793961 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/373f0364-84dc-446c-87fa-bb03f4bf1baf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-q54z4\" (UID: \"373f0364-84dc-446c-87fa-bb03f4bf1baf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.794021 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.794036 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f5c4fc-08a4-4683-ab53-e20612b27d02-serving-cert\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.794133 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52d03758-6fb1-4040-ae86-d2a89d6cc88f-serving-cert\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.794210 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4wzb\" (UniqueName: \"kubernetes.io/projected/52d03758-6fb1-4040-ae86-d2a89d6cc88f-kube-api-access-r4wzb\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.794282 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e2d759-27e8-4e8f-8d6b-86817b091df5-serving-cert\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.794376 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1aa5c7a7-c270-4c62-b054-88a85fbfc8b9-metrics-tls\") pod \"dns-operator-744455d44c-8nfjc\" (UID: \"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.794384 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-service-ca-bundle\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.794416 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5085aee7-8987-489e-86af-3c11f1a6618d-config\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.796167 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqn4p\" (UniqueName: \"kubernetes.io/projected/e735de1a-2c56-45b3-b091-33ef92a3b119-kube-api-access-lqn4p\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.797066 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.797241 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/873ee65e-5320-4949-8caa-893b41061408-audit-dir\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.797603 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.798365 
4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-serving-cert\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802012 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802062 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-config\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802391 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802527 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgxql\" (UniqueName: \"kubernetes.io/projected/f1827035-d23f-4436-96ee-f363b9ea9022-kube-api-access-jgxql\") pod \"cluster-samples-operator-665b6dd947-xvpkl\" (UID: 
\"f1827035-d23f-4436-96ee-f363b9ea9022\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802647 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a2762223-c499-429c-814a-00ead7b447d8-signing-cabundle\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802707 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mjfh\" (UniqueName: \"kubernetes.io/projected/373f0364-84dc-446c-87fa-bb03f4bf1baf-kube-api-access-4mjfh\") pod \"multus-admission-controller-857f4d67dd-q54z4\" (UID: \"373f0364-84dc-446c-87fa-bb03f4bf1baf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802788 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de6067f-4bc2-4265-bb7f-e595f6060033-console-serving-cert\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802804 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8efe62a3-ec31-4144-8d34-150502a96362-srv-cert\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802864 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49c0687d-9489-4429-9ef2-09f82f7df268-trusted-ca\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802935 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqw44\" (UniqueName: \"kubernetes.io/projected/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-kube-api-access-fqw44\") pod \"dns-default-xf5wg\" (UID: \"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802994 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e2d759-27e8-4e8f-8d6b-86817b091df5-config\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.802788 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803030 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-config\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: 
I1201 10:33:19.803191 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/906c8f55-3191-4b35-a7d2-80fd512d5c34-serving-cert\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803239 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-csi-data-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803456 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-config\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803486 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-trusted-ca-bundle\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803505 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803614 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803674 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbsq\" (UniqueName: \"kubernetes.io/projected/82353b64-ab8c-431e-8732-ba585bd9cc95-kube-api-access-cvbsq\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803804 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-socket-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.803876 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65e456cc-7c02-479e-a278-af630a5dfd6f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.804397 4761 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-oauth-serving-cert\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.804643 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-config\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.804740 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8bcm\" (UniqueName: \"kubernetes.io/projected/c9161413-554f-4d53-bc23-efd48ff91a94-kube-api-access-s8bcm\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.804873 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e7458e-68c1-4b57-a6f5-43eed3453e64-config\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.805845 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-trusted-ca\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.805901 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e735de1a-2c56-45b3-b091-33ef92a3b119-serving-cert\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.805926 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e735de1a-2c56-45b3-b091-33ef92a3b119-service-ca-bundle\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.806436 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.806535 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49c0687d-9489-4429-9ef2-09f82f7df268-metrics-tls\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.806826 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: 
\"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.806904 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/436b751c-ff5f-4b20-b63c-9960d2bebfb5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.806920 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5085aee7-8987-489e-86af-3c11f1a6618d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.807024 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-audit\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.807059 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.807084 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49c0687d-9489-4429-9ef2-09f82f7df268-bound-sa-token\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.807629 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/906c8f55-3191-4b35-a7d2-80fd512d5c34-serving-cert\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:19 crc kubenswrapper[4761]: E1201 10:33:19.807662 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.307637628 +0000 UTC m=+139.611396292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.807998 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808124 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3454eba8-593e-4647-8b4b-71e0f432ffeb-audit\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808169 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdcz\" (UniqueName: \"kubernetes.io/projected/0de6067f-4bc2-4265-bb7f-e595f6060033-kube-api-access-kvdcz\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808286 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c36a280-252a-48bd-a64d-be969429a43d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: 
\"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808504 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2f4q\" (UniqueName: \"kubernetes.io/projected/ddd9b688-a86b-4d31-b3f6-eb674a12d438-kube-api-access-w2f4q\") pod \"ingress-canary-zlgvt\" (UID: \"ddd9b688-a86b-4d31-b3f6-eb674a12d438\") " pod="openshift-ingress-canary/ingress-canary-zlgvt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808544 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z6wf\" (UniqueName: \"kubernetes.io/projected/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-kube-api-access-7z6wf\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808595 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5085aee7-8987-489e-86af-3c11f1a6618d-images\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808800 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-config\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808896 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trssg\" (UniqueName: 
\"kubernetes.io/projected/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-kube-api-access-trssg\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.808983 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e735de1a-2c56-45b3-b091-33ef92a3b119-serving-cert\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.809255 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkhn4\" (UniqueName: \"kubernetes.io/projected/906c8f55-3191-4b35-a7d2-80fd512d5c34-kube-api-access-fkhn4\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.809343 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmjt5\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-kube-api-access-kmjt5\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.809804 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5085aee7-8987-489e-86af-3c11f1a6618d-images\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 
10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.810386 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-tls\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.811579 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-config\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813004 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-trusted-ca\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813078 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82353b64-ab8c-431e-8732-ba585bd9cc95-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813126 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-config-volume\") pod \"dns-default-xf5wg\" (UID: 
\"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813159 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0373f01a-1b29-45f1-a72b-f96dbfb5e359-srv-cert\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813225 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813268 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjhl8\" (UniqueName: \"kubernetes.io/projected/5085aee7-8987-489e-86af-3c11f1a6618d-kube-api-access-wjhl8\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813360 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5615f9d-052a-4910-8050-d39d2d9dde06-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813387 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/52d03758-6fb1-4040-ae86-d2a89d6cc88f-serving-cert\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813402 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de6067f-4bc2-4265-bb7f-e595f6060033-console-oauth-config\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813448 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbcnt\" (UniqueName: \"kubernetes.io/projected/8555dd96-901c-4ef4-b63b-816d54e1489b-kube-api-access-dbcnt\") pod \"migrator-59844c95c7-k46jr\" (UID: \"8555dd96-901c-4ef4-b63b-816d54e1489b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.813616 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.814468 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc 
kubenswrapper[4761]: I1201 10:33:19.815063 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0de6067f-4bc2-4265-bb7f-e595f6060033-service-ca\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.815592 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-client-ca\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.816127 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.816360 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-trusted-ca\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.816938 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-encryption-config\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.817078 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.817343 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f5c4fc-08a4-4683-ab53-e20612b27d02-serving-cert\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.818774 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2e7458e-68c1-4b57-a6f5-43eed3453e64-etcd-client\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.818894 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5615f9d-052a-4910-8050-d39d2d9dde06-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.819504 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3454eba8-593e-4647-8b4b-71e0f432ffeb-serving-cert\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.820327 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.820438 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-certificates\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.820899 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de6067f-4bc2-4265-bb7f-e595f6060033-console-oauth-config\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.835160 4761 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.855162 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.874119 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.893470 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.893831 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" event={"ID":"1328387a-a550-49b5-92ce-7019cb401bfb","Type":"ContainerStarted","Data":"29c47280364f14212928da11be065962298e105a3e9e0aa13bfd605e99205d06"} Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914133 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914517 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:19 crc kubenswrapper[4761]: E1201 10:33:19.914679 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.414660504 +0000 UTC m=+139.718419128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914745 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65e456cc-7c02-479e-a278-af630a5dfd6f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914810 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3ce8fad-1931-4034-96f9-6b9750665a36-apiservice-cert\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914835 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a304a57-2fa4-477c-8d57-4f411e4f8790-proxy-tls\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914876 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l29t2\" (UniqueName: 
\"kubernetes.io/projected/0b62db57-21d8-498f-9d27-8030bc510076-kube-api-access-l29t2\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914901 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6v8\" (UniqueName: \"kubernetes.io/projected/4a70f5c2-aba5-46bb-a96b-da503d30e66e-kube-api-access-np6v8\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914963 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9161413-554f-4d53-bc23-efd48ff91a94-secret-volume\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.914987 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b62db57-21d8-498f-9d27-8030bc510076-proxy-tls\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915032 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/436b751c-ff5f-4b20-b63c-9960d2bebfb5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:19 crc 
kubenswrapper[4761]: I1201 10:33:19.915051 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915071 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gswnm\" (UniqueName: \"kubernetes.io/projected/49f94e97-89ed-41ca-b0c1-620d9e69ae81-kube-api-access-gswnm\") pod \"control-plane-machine-set-operator-78cbb6b69f-lwd6m\" (UID: \"49f94e97-89ed-41ca-b0c1-620d9e69ae81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915107 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915124 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1827035-d23f-4436-96ee-f363b9ea9022-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xvpkl\" (UID: \"f1827035-d23f-4436-96ee-f363b9ea9022\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915143 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6932484c-2cc4-42a6-816f-c368946e0a29-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-852hr\" (UID: \"6932484c-2cc4-42a6-816f-c368946e0a29\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915177 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/49f94e97-89ed-41ca-b0c1-620d9e69ae81-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lwd6m\" (UID: \"49f94e97-89ed-41ca-b0c1-620d9e69ae81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915195 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/373f0364-84dc-446c-87fa-bb03f4bf1baf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-q54z4\" (UID: \"373f0364-84dc-446c-87fa-bb03f4bf1baf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915218 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e2d759-27e8-4e8f-8d6b-86817b091df5-serving-cert\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915266 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mjfh\" (UniqueName: \"kubernetes.io/projected/373f0364-84dc-446c-87fa-bb03f4bf1baf-kube-api-access-4mjfh\") pod \"multus-admission-controller-857f4d67dd-q54z4\" (UID: \"373f0364-84dc-446c-87fa-bb03f4bf1baf\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915282 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8efe62a3-ec31-4144-8d34-150502a96362-srv-cert\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915304 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgxql\" (UniqueName: \"kubernetes.io/projected/f1827035-d23f-4436-96ee-f363b9ea9022-kube-api-access-jgxql\") pod \"cluster-samples-operator-665b6dd947-xvpkl\" (UID: \"f1827035-d23f-4436-96ee-f363b9ea9022\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915336 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a2762223-c499-429c-814a-00ead7b447d8-signing-cabundle\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915353 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e2d759-27e8-4e8f-8d6b-86817b091df5-config\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915371 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49c0687d-9489-4429-9ef2-09f82f7df268-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915387 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqw44\" (UniqueName: \"kubernetes.io/projected/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-kube-api-access-fqw44\") pod \"dns-default-xf5wg\" (UID: \"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915403 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-csi-data-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915423 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbsq\" (UniqueName: \"kubernetes.io/projected/82353b64-ab8c-431e-8732-ba585bd9cc95-kube-api-access-cvbsq\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915448 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8bcm\" (UniqueName: \"kubernetes.io/projected/c9161413-554f-4d53-bc23-efd48ff91a94-kube-api-access-s8bcm\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915465 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-socket-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915483 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65e456cc-7c02-479e-a278-af630a5dfd6f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915500 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49c0687d-9489-4429-9ef2-09f82f7df268-metrics-tls\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915519 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915535 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/436b751c-ff5f-4b20-b63c-9960d2bebfb5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915563 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915578 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49c0687d-9489-4429-9ef2-09f82f7df268-bound-sa-token\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915594 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2f4q\" (UniqueName: \"kubernetes.io/projected/ddd9b688-a86b-4d31-b3f6-eb674a12d438-kube-api-access-w2f4q\") pod \"ingress-canary-zlgvt\" (UID: \"ddd9b688-a86b-4d31-b3f6-eb674a12d438\") " pod="openshift-ingress-canary/ingress-canary-zlgvt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915618 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c36a280-252a-48bd-a64d-be969429a43d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: \"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915634 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z6wf\" (UniqueName: 
\"kubernetes.io/projected/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-kube-api-access-7z6wf\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915661 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-config-volume\") pod \"dns-default-xf5wg\" (UID: \"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915676 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0373f01a-1b29-45f1-a72b-f96dbfb5e359-srv-cert\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915703 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82353b64-ab8c-431e-8732-ba585bd9cc95-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915726 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbcnt\" (UniqueName: \"kubernetes.io/projected/8555dd96-901c-4ef4-b63b-816d54e1489b-kube-api-access-dbcnt\") pod \"migrator-59844c95c7-k46jr\" (UID: \"8555dd96-901c-4ef4-b63b-816d54e1489b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915745 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915759 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b62db57-21d8-498f-9d27-8030bc510076-images\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915775 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcjqm\" (UniqueName: \"kubernetes.io/projected/8efe62a3-ec31-4144-8d34-150502a96362-kube-api-access-xcjqm\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915790 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82353b64-ab8c-431e-8732-ba585bd9cc95-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915808 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/00283b92-e322-4923-8631-2f77c33b8993-certs\") pod \"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " 
pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915833 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbtr\" (UniqueName: \"kubernetes.io/projected/10e2d759-27e8-4e8f-8d6b-86817b091df5-kube-api-access-hdbtr\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915848 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3ce8fad-1931-4034-96f9-6b9750665a36-webhook-cert\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915872 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-config\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915886 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-metrics-certs\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915909 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-registration-dir\") 
pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915927 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a2762223-c499-429c-814a-00ead7b447d8-signing-key\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915943 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l86t6\" (UniqueName: \"kubernetes.io/projected/0a304a57-2fa4-477c-8d57-4f411e4f8790-kube-api-access-l86t6\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915958 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxfpg\" (UniqueName: \"kubernetes.io/projected/00283b92-e322-4923-8631-2f77c33b8993-kube-api-access-wxfpg\") pod \"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915976 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjf9m\" (UniqueName: \"kubernetes.io/projected/a2762223-c499-429c-814a-00ead7b447d8-kube-api-access-rjf9m\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.915990 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-default-certificate\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916003 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-service-ca-bundle\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916022 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnctx\" (UniqueName: \"kubernetes.io/projected/7886c492-0b69-4cb1-aef7-08e7e482bc6a-kube-api-access-hnctx\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916039 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/00283b92-e322-4923-8631-2f77c33b8993-node-bootstrap-token\") pod \"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916053 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddd9b688-a86b-4d31-b3f6-eb674a12d438-cert\") pod \"ingress-canary-zlgvt\" (UID: \"ddd9b688-a86b-4d31-b3f6-eb674a12d438\") " pod="openshift-ingress-canary/ingress-canary-zlgvt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916071 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/436b751c-ff5f-4b20-b63c-9960d2bebfb5-config\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916085 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8efe62a3-ec31-4144-8d34-150502a96362-profile-collector-cert\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916109 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-stats-auth\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916128 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flrxm\" (UniqueName: \"kubernetes.io/projected/e3ce8fad-1931-4034-96f9-6b9750665a36-kube-api-access-flrxm\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916148 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c36a280-252a-48bd-a64d-be969429a43d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: \"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 
10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916164 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a304a57-2fa4-477c-8d57-4f411e4f8790-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916180 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh4th\" (UniqueName: \"kubernetes.io/projected/0373f01a-1b29-45f1-a72b-f96dbfb5e359-kube-api-access-nh4th\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916202 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e3ce8fad-1931-4034-96f9-6b9750665a36-tmpfs\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916218 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-metrics-tls\") pod \"dns-default-xf5wg\" (UID: \"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916236 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e456cc-7c02-479e-a278-af630a5dfd6f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916254 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbrll\" (UniqueName: \"kubernetes.io/projected/49c0687d-9489-4429-9ef2-09f82f7df268-kube-api-access-gbrll\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916270 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b62db57-21d8-498f-9d27-8030bc510076-auth-proxy-config\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916285 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9161413-554f-4d53-bc23-efd48ff91a94-config-volume\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916299 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-mountpoint-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916316 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-plugins-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916332 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxl5g\" (UniqueName: \"kubernetes.io/projected/6932484c-2cc4-42a6-816f-c368946e0a29-kube-api-access-fxl5g\") pod \"package-server-manager-789f6589d5-852hr\" (UID: \"6932484c-2cc4-42a6-816f-c368946e0a29\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916351 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0373f01a-1b29-45f1-a72b-f96dbfb5e359-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.916374 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9ddp\" (UniqueName: \"kubernetes.io/projected/1c36a280-252a-48bd-a64d-be969429a43d-kube-api-access-q9ddp\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: \"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.919044 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.919366 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a304a57-2fa4-477c-8d57-4f411e4f8790-proxy-tls\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.919395 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b62db57-21d8-498f-9d27-8030bc510076-proxy-tls\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.919781 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3ce8fad-1931-4034-96f9-6b9750665a36-apiservice-cert\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.919928 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-registration-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.919947 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b62db57-21d8-498f-9d27-8030bc510076-images\") pod \"machine-config-operator-74547568cd-926pr\" 
(UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.920644 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/436b751c-ff5f-4b20-b63c-9960d2bebfb5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.921118 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/436b751c-ff5f-4b20-b63c-9960d2bebfb5-config\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.923929 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82353b64-ab8c-431e-8732-ba585bd9cc95-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.924377 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.924633 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-socket-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.925839 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8efe62a3-ec31-4144-8d34-150502a96362-profile-collector-cert\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.926119 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1827035-d23f-4436-96ee-f363b9ea9022-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xvpkl\" (UID: \"f1827035-d23f-4436-96ee-f363b9ea9022\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.926922 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9161413-554f-4d53-bc23-efd48ff91a94-secret-volume\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.928438 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a2762223-c499-429c-814a-00ead7b447d8-signing-cabundle\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:19 
crc kubenswrapper[4761]: I1201 10:33:19.929218 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-config-volume\") pod \"dns-default-xf5wg\" (UID: \"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.929238 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e2d759-27e8-4e8f-8d6b-86817b091df5-config\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:19 crc kubenswrapper[4761]: E1201 10:33:19.929576 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.429547576 +0000 UTC m=+139.733306200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.930084 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c36a280-252a-48bd-a64d-be969429a43d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: \"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.930480 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49c0687d-9489-4429-9ef2-09f82f7df268-trusted-ca\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.930655 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-csi-data-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.931192 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6932484c-2cc4-42a6-816f-c368946e0a29-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-852hr\" 
(UID: \"6932484c-2cc4-42a6-816f-c368946e0a29\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.931213 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/00283b92-e322-4923-8631-2f77c33b8993-certs\") pod \"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.931491 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c36a280-252a-48bd-a64d-be969429a43d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: \"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.931531 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-service-ca-bundle\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.931990 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82353b64-ab8c-431e-8732-ba585bd9cc95-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.932355 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b62db57-21d8-498f-9d27-8030bc510076-auth-proxy-config\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.932456 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e3ce8fad-1931-4034-96f9-6b9750665a36-tmpfs\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.933029 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddd9b688-a86b-4d31-b3f6-eb674a12d438-cert\") pod \"ingress-canary-zlgvt\" (UID: \"ddd9b688-a86b-4d31-b3f6-eb674a12d438\") " pod="openshift-ingress-canary/ingress-canary-zlgvt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.933097 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e456cc-7c02-479e-a278-af630a5dfd6f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.933215 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e2d759-27e8-4e8f-8d6b-86817b091df5-serving-cert\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.933478 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-plugins-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.933511 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.933701 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a2762223-c499-429c-814a-00ead7b447d8-signing-key\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.933810 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.934055 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4a70f5c2-aba5-46bb-a96b-da503d30e66e-mountpoint-dir\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.934202 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-config\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.934933 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65e456cc-7c02-479e-a278-af630a5dfd6f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.935036 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9161413-554f-4d53-bc23-efd48ff91a94-config-volume\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.935708 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a304a57-2fa4-477c-8d57-4f411e4f8790-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.936862 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.937982 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/00283b92-e322-4923-8631-2f77c33b8993-node-bootstrap-token\") pod \"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.938255 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8efe62a3-ec31-4144-8d34-150502a96362-srv-cert\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.938347 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0373f01a-1b29-45f1-a72b-f96dbfb5e359-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.938419 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0373f01a-1b29-45f1-a72b-f96dbfb5e359-srv-cert\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.938618 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49c0687d-9489-4429-9ef2-09f82f7df268-metrics-tls\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.939198 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-default-certificate\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.939636 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3ce8fad-1931-4034-96f9-6b9750665a36-webhook-cert\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.939641 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/49f94e97-89ed-41ca-b0c1-620d9e69ae81-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lwd6m\" (UID: \"49f94e97-89ed-41ca-b0c1-620d9e69ae81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.939752 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-metrics-certs\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.940428 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-stats-auth\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.940716 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/373f0364-84dc-446c-87fa-bb03f4bf1baf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-q54z4\" (UID: \"373f0364-84dc-446c-87fa-bb03f4bf1baf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.946953 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-metrics-tls\") pod \"dns-default-xf5wg\" (UID: \"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.953976 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.971749 4761 request.go:700] Waited for 1.843935787s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.973576 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 10:33:19 crc kubenswrapper[4761]: I1201 10:33:19.994710 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.013865 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.016906 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.017063 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.517040773 +0000 UTC m=+139.820799407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.017148 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.017429 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.517420404 +0000 UTC m=+139.821179028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.033664 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.042016 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn"] Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.053239 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 10:33:20 crc kubenswrapper[4761]: W1201 10:33:20.055872 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78cf6292_f923_40a8_9f4e_183d70e31a7f.slice/crio-6f592df11c7da65c7019d632a5a6a8740c74ca354ff5c6b329f87301afa4ae00 WatchSource:0}: Error finding container 6f592df11c7da65c7019d632a5a6a8740c74ca354ff5c6b329f87301afa4ae00: Status 404 returned error can't find the container with id 6f592df11c7da65c7019d632a5a6a8740c74ca354ff5c6b329f87301afa4ae00 Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.108074 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7wbb\" (UniqueName: \"kubernetes.io/projected/33f5c4fc-08a4-4683-ab53-e20612b27d02-kube-api-access-g7wbb\") pod \"route-controller-manager-6576b87f9c-bxmxl\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.116693 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.118893 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.119169 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.619131633 +0000 UTC m=+139.922890307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.119420 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.119768 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.619751181 +0000 UTC m=+139.923509815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.130494 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kct62\" (UniqueName: \"kubernetes.io/projected/3454eba8-593e-4647-8b4b-71e0f432ffeb-kube-api-access-kct62\") pod \"apiserver-76f77b778f-tfh9j\" (UID: \"3454eba8-593e-4647-8b4b-71e0f432ffeb\") " pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.158472 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-bound-sa-token\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.176994 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkd7v\" (UniqueName: \"kubernetes.io/projected/873ee65e-5320-4949-8caa-893b41061408-kube-api-access-fkd7v\") pod \"oauth-openshift-558db77b4-82k6m\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.189686 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzgtb\" (UniqueName: \"kubernetes.io/projected/7713e7f9-1a0c-448b-9814-c143fdd040ec-kube-api-access-lzgtb\") pod \"downloads-7954f5f757-fqctr\" (UID: 
\"7713e7f9-1a0c-448b-9814-c143fdd040ec\") " pod="openshift-console/downloads-7954f5f757-fqctr" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.210318 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85899\" (UniqueName: \"kubernetes.io/projected/f2e7458e-68c1-4b57-a6f5-43eed3453e64-kube-api-access-85899\") pod \"etcd-operator-b45778765-wqskh\" (UID: \"f2e7458e-68c1-4b57-a6f5-43eed3453e64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.221125 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.221256 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.721238863 +0000 UTC m=+140.024997487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.221425 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.221850 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.721842981 +0000 UTC m=+140.025601605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.226753 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc2dk\" (UniqueName: \"kubernetes.io/projected/1aa5c7a7-c270-4c62-b054-88a85fbfc8b9-kube-api-access-wc2dk\") pod \"dns-operator-744455d44c-8nfjc\" (UID: \"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.228762 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.245358 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lb99\" (UniqueName: \"kubernetes.io/projected/78b1b0de-2b28-45c2-a655-fe3edb1c72d8-kube-api-access-9lb99\") pod \"console-operator-58897d9998-r4655\" (UID: \"78b1b0de-2b28-45c2-a655-fe3edb1c72d8\") " pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.265446 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4wzb\" (UniqueName: \"kubernetes.io/projected/52d03758-6fb1-4040-ae86-d2a89d6cc88f-kube-api-access-r4wzb\") pod \"controller-manager-879f6c89f-pgskl\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.289106 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqn4p\" (UniqueName: \"kubernetes.io/projected/e735de1a-2c56-45b3-b091-33ef92a3b119-kube-api-access-lqn4p\") pod \"authentication-operator-69f744f599-v7r4l\" (UID: \"e735de1a-2c56-45b3-b091-33ef92a3b119\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.289679 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr"] Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.311136 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdcz\" (UniqueName: \"kubernetes.io/projected/0de6067f-4bc2-4265-bb7f-e595f6060033-kube-api-access-kvdcz\") pod \"console-f9d7485db-bz89h\" (UID: \"0de6067f-4bc2-4265-bb7f-e595f6060033\") " pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:20 crc kubenswrapper[4761]: W1201 10:33:20.320252 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06629698_b5a8_41a6_b94b_771abc920e20.slice/crio-4d2813243afae49c5454f4c4b976ac28d143b737b1af094088b5f45606f9f6f5 WatchSource:0}: Error finding container 4d2813243afae49c5454f4c4b976ac28d143b737b1af094088b5f45606f9f6f5: Status 404 returned error can't find the container with id 4d2813243afae49c5454f4c4b976ac28d143b737b1af094088b5f45606f9f6f5 Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.322239 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.322429 4761 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.822403046 +0000 UTC m=+140.126161680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.322639 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.323006 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.822995903 +0000 UTC m=+140.126754537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.323188 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.334792 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trssg\" (UniqueName: \"kubernetes.io/projected/2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea-kube-api-access-trssg\") pod \"openshift-controller-manager-operator-756b6f6bc6-rdbjm\" (UID: \"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.341220 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-fqctr" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.347644 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.355902 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmjt5\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-kube-api-access-kmjt5\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.388178 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkhn4\" (UniqueName: \"kubernetes.io/projected/906c8f55-3191-4b35-a7d2-80fd512d5c34-kube-api-access-fkhn4\") pod \"openshift-config-operator-7777fb866f-v6dsn\" (UID: \"906c8f55-3191-4b35-a7d2-80fd512d5c34\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.390802 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjhl8\" (UniqueName: \"kubernetes.io/projected/5085aee7-8987-489e-86af-3c11f1a6618d-kube-api-access-wjhl8\") pod \"machine-api-operator-5694c8668f-xzg25\" (UID: \"5085aee7-8987-489e-86af-3c11f1a6618d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.393171 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.402739 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.410800 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65e456cc-7c02-479e-a278-af630a5dfd6f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kjxwd\" (UID: \"65e456cc-7c02-479e-a278-af630a5dfd6f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.422613 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.423830 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.424434 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:20.924411773 +0000 UTC m=+140.228170387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.428904 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.440591 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29t2\" (UniqueName: \"kubernetes.io/projected/0b62db57-21d8-498f-9d27-8030bc510076-kube-api-access-l29t2\") pod \"machine-config-operator-74547568cd-926pr\" (UID: \"0b62db57-21d8-498f-9d27-8030bc510076\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.443841 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.450416 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.454179 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6v8\" (UniqueName: \"kubernetes.io/projected/4a70f5c2-aba5-46bb-a96b-da503d30e66e-kube-api-access-np6v8\") pod \"csi-hostpathplugin-npj9f\" (UID: \"4a70f5c2-aba5-46bb-a96b-da503d30e66e\") " pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.468643 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9ddp\" (UniqueName: \"kubernetes.io/projected/1c36a280-252a-48bd-a64d-be969429a43d-kube-api-access-q9ddp\") pod \"openshift-apiserver-operator-796bbdcf4f-zqrqq\" (UID: \"1c36a280-252a-48bd-a64d-be969429a43d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.490696 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.491640 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcjqm\" (UniqueName: \"kubernetes.io/projected/8efe62a3-ec31-4144-8d34-150502a96362-kube-api-access-xcjqm\") pod \"catalog-operator-68c6474976-fdthh\" (UID: \"8efe62a3-ec31-4144-8d34-150502a96362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.510591 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbsq\" (UniqueName: \"kubernetes.io/projected/82353b64-ab8c-431e-8732-ba585bd9cc95-kube-api-access-cvbsq\") pod \"kube-storage-version-migrator-operator-b67b599dd-tm88h\" (UID: \"82353b64-ab8c-431e-8732-ba585bd9cc95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.525111 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.525482 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.025463053 +0000 UTC m=+140.329221677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.543605 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gswnm\" (UniqueName: \"kubernetes.io/projected/49f94e97-89ed-41ca-b0c1-620d9e69ae81-kube-api-access-gswnm\") pod \"control-plane-machine-set-operator-78cbb6b69f-lwd6m\" (UID: \"49f94e97-89ed-41ca-b0c1-620d9e69ae81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.545275 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f0fecb-6dd3-4b45-9dfd-cdde8814bf48-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ft8qk\" (UID: \"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.568194 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.572074 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v7r4l"] Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.575404 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8bcm\" (UniqueName: \"kubernetes.io/projected/c9161413-554f-4d53-bc23-efd48ff91a94-kube-api-access-s8bcm\") pod \"collect-profiles-29409750-5ncx6\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.575928 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.579676 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.596418 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.604680 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.614000 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxfpg\" (UniqueName: \"kubernetes.io/projected/00283b92-e322-4923-8631-2f77c33b8993-kube-api-access-wxfpg\") pod \"machine-config-server-hxb77\" (UID: \"00283b92-e322-4923-8631-2f77c33b8993\") " pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.625939 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.626310 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l86t6\" (UniqueName: \"kubernetes.io/projected/0a304a57-2fa4-477c-8d57-4f411e4f8790-kube-api-access-l86t6\") pod \"machine-config-controller-84d6567774-std2v\" (UID: \"0a304a57-2fa4-477c-8d57-4f411e4f8790\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.626422 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.126407449 +0000 UTC m=+140.430166073 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.627966 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hxb77" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.629367 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fqctr"] Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.634014 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgxql\" (UniqueName: \"kubernetes.io/projected/f1827035-d23f-4436-96ee-f363b9ea9022-kube-api-access-jgxql\") pod \"cluster-samples-operator-665b6dd947-xvpkl\" (UID: \"f1827035-d23f-4436-96ee-f363b9ea9022\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.650115 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjf9m\" (UniqueName: \"kubernetes.io/projected/a2762223-c499-429c-814a-00ead7b447d8-kube-api-access-rjf9m\") pod \"service-ca-9c57cc56f-472l6\" (UID: \"a2762223-c499-429c-814a-00ead7b447d8\") " pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.656106 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-npj9f" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.663774 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.689377 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z6wf\" (UniqueName: \"kubernetes.io/projected/e423ab17-2ba9-4b3a-8ff8-17c0addd9077-kube-api-access-7z6wf\") pod \"router-default-5444994796-lpmsm\" (UID: \"e423ab17-2ba9-4b3a-8ff8-17c0addd9077\") " pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.701547 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bz89h"] Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.718537 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnctx\" (UniqueName: \"kubernetes.io/projected/7886c492-0b69-4cb1-aef7-08e7e482bc6a-kube-api-access-hnctx\") pod \"marketplace-operator-79b997595-2xx98\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.720888 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbtr\" (UniqueName: \"kubernetes.io/projected/10e2d759-27e8-4e8f-8d6b-86817b091df5-kube-api-access-hdbtr\") pod \"service-ca-operator-777779d784-5xxsl\" (UID: \"10e2d759-27e8-4e8f-8d6b-86817b091df5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.727425 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 
crc kubenswrapper[4761]: E1201 10:33:20.727992 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.227979563 +0000 UTC m=+140.531738187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.730231 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49c0687d-9489-4429-9ef2-09f82f7df268-bound-sa-token\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.735898 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl"] Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.752920 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tfh9j"] Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.756211 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.761280 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2f4q\" (UniqueName: \"kubernetes.io/projected/ddd9b688-a86b-4d31-b3f6-eb674a12d438-kube-api-access-w2f4q\") pod \"ingress-canary-zlgvt\" (UID: \"ddd9b688-a86b-4d31-b3f6-eb674a12d438\") " pod="openshift-ingress-canary/ingress-canary-zlgvt" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.768214 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.773054 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.775263 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flrxm\" (UniqueName: \"kubernetes.io/projected/e3ce8fad-1931-4034-96f9-6b9750665a36-kube-api-access-flrxm\") pod \"packageserver-d55dfcdfc-hxs9g\" (UID: \"e3ce8fad-1931-4034-96f9-6b9750665a36\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:20 crc kubenswrapper[4761]: W1201 10:33:20.782965 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f5c4fc_08a4_4683_ab53_e20612b27d02.slice/crio-6b84446ab1023db3abc8d774bf88d7aedff5d159473099ed48acbdd31d142deb WatchSource:0}: Error finding container 6b84446ab1023db3abc8d774bf88d7aedff5d159473099ed48acbdd31d142deb: Status 404 returned error can't find the container with id 6b84446ab1023db3abc8d774bf88d7aedff5d159473099ed48acbdd31d142deb Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.797865 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.805379 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mjfh\" (UniqueName: \"kubernetes.io/projected/373f0364-84dc-446c-87fa-bb03f4bf1baf-kube-api-access-4mjfh\") pod \"multus-admission-controller-857f4d67dd-q54z4\" (UID: \"373f0364-84dc-446c-87fa-bb03f4bf1baf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.811743 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqw44\" (UniqueName: \"kubernetes.io/projected/2248852d-ea7a-49eb-bbfc-c87aa7f6c597-kube-api-access-fqw44\") pod \"dns-default-xf5wg\" (UID: \"2248852d-ea7a-49eb-bbfc-c87aa7f6c597\") " pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.813336 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.829076 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.829233 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.329195597 +0000 UTC m=+140.632954221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.829369 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.829703 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.329693202 +0000 UTC m=+140.633451836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.829720 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/436b751c-ff5f-4b20-b63c-9960d2bebfb5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8vdgh\" (UID: \"436b751c-ff5f-4b20-b63c-9960d2bebfb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.836745 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.845082 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.850621 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbcnt\" (UniqueName: \"kubernetes.io/projected/8555dd96-901c-4ef4-b63b-816d54e1489b-kube-api-access-dbcnt\") pod \"migrator-59844c95c7-k46jr\" (UID: \"8555dd96-901c-4ef4-b63b-816d54e1489b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.863417 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-472l6" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.869025 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh4th\" (UniqueName: \"kubernetes.io/projected/0373f01a-1b29-45f1-a72b-f96dbfb5e359-kube-api-access-nh4th\") pod \"olm-operator-6b444d44fb-gqb5c\" (UID: \"0373f01a-1b29-45f1-a72b-f96dbfb5e359\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.887114 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.897866 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbrll\" (UniqueName: \"kubernetes.io/projected/49c0687d-9489-4429-9ef2-09f82f7df268-kube-api-access-gbrll\") pod \"ingress-operator-5b745b69d9-827fw\" (UID: \"49c0687d-9489-4429-9ef2-09f82f7df268\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.905822 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" event={"ID":"e735de1a-2c56-45b3-b091-33ef92a3b119","Type":"ContainerStarted","Data":"62362efece054c5418e6a5009886b36552c91d8ff41d8bb8245322eec5599d0a"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.909442 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxl5g\" (UniqueName: \"kubernetes.io/projected/6932484c-2cc4-42a6-816f-c368946e0a29-kube-api-access-fxl5g\") pod \"package-server-manager-789f6589d5-852hr\" (UID: \"6932484c-2cc4-42a6-816f-c368946e0a29\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.913223 4761 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.914007 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" event={"ID":"3454eba8-593e-4647-8b4b-71e0f432ffeb","Type":"ContainerStarted","Data":"4995eb9e355990f0da28eb50783d815565145322d29e5c5702f73b47c3e578ad"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.919985 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.921656 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" event={"ID":"06629698-b5a8-41a6-b94b-771abc920e20","Type":"ContainerStarted","Data":"4d2813243afae49c5454f4c4b976ac28d143b737b1af094088b5f45606f9f6f5"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.924414 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hxb77" event={"ID":"00283b92-e322-4923-8631-2f77c33b8993","Type":"ContainerStarted","Data":"fda3c4ca64b6e650eb4e8c2421dce731976083273cdf1704fa2ba58b5ce28bf6"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.925250 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82k6m"] Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.929072 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" event={"ID":"33f5c4fc-08a4-4683-ab53-e20612b27d02","Type":"ContainerStarted","Data":"6b84446ab1023db3abc8d774bf88d7aedff5d159473099ed48acbdd31d142deb"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.930447 4761 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.930585 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.430542285 +0000 UTC m=+140.734300929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.930996 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" event={"ID":"78cf6292-f923-40a8-9f4e-183d70e31a7f","Type":"ContainerStarted","Data":"0c87ae391015b86c0043e6fd50cd4f638388166c9c4ff8f777ea977af25fd6b7"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.931037 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" event={"ID":"78cf6292-f923-40a8-9f4e-183d70e31a7f","Type":"ContainerStarted","Data":"6f592df11c7da65c7019d632a5a6a8740c74ca354ff5c6b329f87301afa4ae00"} Dec 01 10:33:20 crc kubenswrapper[4761]: E1201 10:33:20.931347 4761 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.431334239 +0000 UTC m=+140.735092863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.931635 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.931761 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fqctr" event={"ID":"7713e7f9-1a0c-448b-9814-c143fdd040ec","Type":"ContainerStarted","Data":"777a4b545346ce49520b2993894c06564288bf1c862117019686b581740a8866"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.936426 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bz89h" event={"ID":"0de6067f-4bc2-4265-bb7f-e595f6060033","Type":"ContainerStarted","Data":"25dd0a418bd826e1941fe13e18890390f49e636fea7017360171620af348b65d"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.937630 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zlgvt" Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.941947 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" event={"ID":"1328387a-a550-49b5-92ce-7019cb401bfb","Type":"ContainerStarted","Data":"931be8cbbd31227a45d3b80773a4d81697612f5592c3045a1dac89cd0acc6386"} Dec 01 10:33:20 crc kubenswrapper[4761]: I1201 10:33:20.966034 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.033128 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.033250 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.533230263 +0000 UTC m=+140.836988887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.033612 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.033887 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.533877682 +0000 UTC m=+140.837636306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.079780 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8nfjc"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.081307 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.105226 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.105632 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wqskh"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.120518 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.127362 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.134117 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.134508 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.634490898 +0000 UTC m=+140.938249522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.155579 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.235382 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.235774 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.735757064 +0000 UTC m=+141.039515688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.287685 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r4655"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.290228 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm"] Dec 01 10:33:21 crc kubenswrapper[4761]: W1201 10:33:21.312198 4761 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa5c7a7_c270_4c62_b054_88a85fbfc8b9.slice/crio-7eb27d90c3d406e341795cae9460698bd6881e8bf3084ce96a4cbb2834bdbc05 WatchSource:0}: Error finding container 7eb27d90c3d406e341795cae9460698bd6881e8bf3084ce96a4cbb2834bdbc05: Status 404 returned error can't find the container with id 7eb27d90c3d406e341795cae9460698bd6881e8bf3084ce96a4cbb2834bdbc05 Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.336386 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.336785 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:21.836769302 +0000 UTC m=+141.140527926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: W1201 10:33:21.422054 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b1b0de_2b28_45c2_a655_fe3edb1c72d8.slice/crio-b45becd25debdceabb0b1578c0afef09c55bb5c5d5ac89f8922d28cda13d64be WatchSource:0}: Error finding container b45becd25debdceabb0b1578c0afef09c55bb5c5d5ac89f8922d28cda13d64be: Status 404 returned error can't find the container with id b45becd25debdceabb0b1578c0afef09c55bb5c5d5ac89f8922d28cda13d64be Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.423414 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-926pr"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.438931 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.439237 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:21.939225653 +0000 UTC m=+141.242984277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.446214 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.458269 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.540106 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.540356 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.040335164 +0000 UTC m=+141.344093788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.540496 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.540818 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.040811208 +0000 UTC m=+141.344569832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.610540 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.641015 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.641378 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.141363762 +0000 UTC m=+141.445122376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.745386 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.745723 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.245708579 +0000 UTC m=+141.549467213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.756844 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pgskl"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.760866 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-npj9f"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.796606 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xzg25"] Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.847658 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.847874 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.34783729 +0000 UTC m=+141.651595914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.847996 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.848328 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.348315595 +0000 UTC m=+141.652074219 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.951721 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:21 crc kubenswrapper[4761]: E1201 10:33:21.952540 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.452523698 +0000 UTC m=+141.756282322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.958620 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r4655" event={"ID":"78b1b0de-2b28-45c2-a655-fe3edb1c72d8","Type":"ContainerStarted","Data":"b45becd25debdceabb0b1578c0afef09c55bb5c5d5ac89f8922d28cda13d64be"} Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.969789 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lpmsm" event={"ID":"e423ab17-2ba9-4b3a-8ff8-17c0addd9077","Type":"ContainerStarted","Data":"9ec8c4178a60c1eb46c02226d428ab11298f15165e795c8bbab5a59bc3601c2d"} Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.969847 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lpmsm" event={"ID":"e423ab17-2ba9-4b3a-8ff8-17c0addd9077","Type":"ContainerStarted","Data":"2521ee0c3d98d81ec092bedae4abd157532d695df65940a8046b084c71b60b2c"} Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.979718 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fqctr" event={"ID":"7713e7f9-1a0c-448b-9814-c143fdd040ec","Type":"ContainerStarted","Data":"382168acaf35f20a9509d9a2054fa69648f3cd46ecd7bcc2b2c5e61b5b3aa48d"} Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.981576 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-fqctr" Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 
10:33:21.994197 4761 patch_prober.go:28] interesting pod/downloads-7954f5f757-fqctr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.994243 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fqctr" podUID="7713e7f9-1a0c-448b-9814-c143fdd040ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 01 10:33:21 crc kubenswrapper[4761]: I1201 10:33:21.995976 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hxb77" event={"ID":"00283b92-e322-4923-8631-2f77c33b8993","Type":"ContainerStarted","Data":"d8ece7c95d92dc0507591bb3a7889da5ddca16a4d9cfbdbd84ac0eeb4e939982"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.030707 4761 generic.go:334] "Generic (PLEG): container finished" podID="06629698-b5a8-41a6-b94b-771abc920e20" containerID="50751b085bad108aab8d31f320490050fa420aea596a19d52841060f9e6759d1" exitCode=0 Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.030794 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" event={"ID":"06629698-b5a8-41a6-b94b-771abc920e20","Type":"ContainerDied","Data":"50751b085bad108aab8d31f320490050fa420aea596a19d52841060f9e6759d1"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.050084 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" event={"ID":"e735de1a-2c56-45b3-b091-33ef92a3b119","Type":"ContainerStarted","Data":"35d8918c50b073dcd4fae302026e34588979f93213464d4935c60618a0eae681"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.053384 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.058293 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" event={"ID":"0b62db57-21d8-498f-9d27-8030bc510076","Type":"ContainerStarted","Data":"68cf32179683fbf3935dc70c622a1b684e657ccf1fe489abcb363ea3592b26f1"} Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.104042 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.604020134 +0000 UTC m=+141.907778758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.104208 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bz89h" event={"ID":"0de6067f-4bc2-4265-bb7f-e595f6060033","Type":"ContainerStarted","Data":"69b71c4373840bb03b9c09dcc7a6b0a0b552823da218fbefaffe0518361afd26"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.107536 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.112877 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" event={"ID":"873ee65e-5320-4949-8caa-893b41061408","Type":"ContainerStarted","Data":"c30c8365f2ff7d6cc04e51be8e5e3dc117a07ce71f6c49a1ded04bba2fe69d14"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.128028 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" event={"ID":"1328387a-a550-49b5-92ce-7019cb401bfb","Type":"ContainerStarted","Data":"794a1e7fb839ea7f9a95508f2f674a7d509b676c71e0c9e40610399548fa0671"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.129283 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" 
event={"ID":"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea","Type":"ContainerStarted","Data":"67fdfaf4aad1f5c4acd0d825f8d15ab872b2ef9d66f20177ad647f47af15c17e"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.133071 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" event={"ID":"49f94e97-89ed-41ca-b0c1-620d9e69ae81","Type":"ContainerStarted","Data":"a2d167ff078296bea33b36f86cbfc3f8560737672b073a7130bac78494d971d5"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.136039 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" event={"ID":"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9","Type":"ContainerStarted","Data":"7eb27d90c3d406e341795cae9460698bd6881e8bf3084ce96a4cbb2834bdbc05"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.145959 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" event={"ID":"f2e7458e-68c1-4b57-a6f5-43eed3453e64","Type":"ContainerStarted","Data":"7ad9ceb10a5e9f0c6ad95ca812e95fb71e49dea053f4fb8938c99900690db745"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.147045 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" event={"ID":"c9161413-554f-4d53-bc23-efd48ff91a94","Type":"ContainerStarted","Data":"0498acb76d625d33b28f6fb5d99b4e3b4baea54db9534f577751d76dafb2c123"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.154237 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.154363 4761 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.654329427 +0000 UTC m=+141.958088051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.154702 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.155184 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.655175952 +0000 UTC m=+141.958934566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.159015 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" event={"ID":"33f5c4fc-08a4-4683-ab53-e20612b27d02","Type":"ContainerStarted","Data":"af537995f5c7635f7dc3bb4959993d721813c4ebd32be939fb412f5f958aa966"} Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.159062 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.256367 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.258774 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.758753716 +0000 UTC m=+142.062512340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.272489 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.366492 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.373636 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.373893 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.873880963 +0000 UTC m=+142.177639577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.388708 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.474924 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.476028 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:22.975993144 +0000 UTC m=+142.279751768 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.523159 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd"] Dec 01 10:33:22 crc kubenswrapper[4761]: W1201 10:33:22.559434 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65e456cc_7c02_479e_a278_af630a5dfd6f.slice/crio-1428c05eadf3a703089f121e4da4715cca5d3716e800e41fc7c034d5be691c0b WatchSource:0}: Error finding container 1428c05eadf3a703089f121e4da4715cca5d3716e800e41fc7c034d5be691c0b: Status 404 returned error can't find the container with id 1428c05eadf3a703089f121e4da4715cca5d3716e800e41fc7c034d5be691c0b Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.578406 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.578704 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:23.078693252 +0000 UTC m=+142.382451876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.610059 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" podStartSLOduration=121.610040682 podStartE2EDuration="2m1.610040682s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.609263899 +0000 UTC m=+141.913022513" watchObservedRunningTime="2025-12-01 10:33:22.610040682 +0000 UTC m=+141.913799306" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.629660 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hxb77" podStartSLOduration=5.629639054 podStartE2EDuration="5.629639054s" podCreationTimestamp="2025-12-01 10:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.628197031 +0000 UTC m=+141.931955655" watchObservedRunningTime="2025-12-01 10:33:22.629639054 +0000 UTC m=+141.933397678" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.653376 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g"] Dec 01 10:33:22 crc 
kubenswrapper[4761]: I1201 10:33:22.653423 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.677178 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-472l6"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.682707 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.682802 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.182777131 +0000 UTC m=+142.486535755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.682930 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.683238 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.183227715 +0000 UTC m=+142.486986339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: W1201 10:33:22.699527 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2762223_c499_429c_814a_00ead7b447d8.slice/crio-24a22e1e319797de83ad70e6201cb18dc876e44a643c5f32da6939b00f21fcb6 WatchSource:0}: Error finding container 24a22e1e319797de83ad70e6201cb18dc876e44a643c5f32da6939b00f21fcb6: Status 404 returned error can't find the container with id 24a22e1e319797de83ad70e6201cb18dc876e44a643c5f32da6939b00f21fcb6 Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.704096 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2xx98"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.709809 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-std2v"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.714847 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lpmsm" podStartSLOduration=121.714831333 podStartE2EDuration="2m1.714831333s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.710124443 +0000 UTC m=+142.013883067" watchObservedRunningTime="2025-12-01 10:33:22.714831333 +0000 UTC m=+142.018589947" Dec 01 10:33:22 crc 
kubenswrapper[4761]: I1201 10:33:22.717758 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zlgvt"] Dec 01 10:33:22 crc kubenswrapper[4761]: W1201 10:33:22.724919 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ce8fad_1931_4034_96f9_6b9750665a36.slice/crio-72fae3e12f85c9c7540ff3d5021b2ea58efb6ad812eee43218eeb3066e0e5333 WatchSource:0}: Error finding container 72fae3e12f85c9c7540ff3d5021b2ea58efb6ad812eee43218eeb3066e0e5333: Status 404 returned error can't find the container with id 72fae3e12f85c9c7540ff3d5021b2ea58efb6ad812eee43218eeb3066e0e5333 Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.749594 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.749940 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.752903 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9tst" podStartSLOduration=122.752886552 podStartE2EDuration="2m2.752886552s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.746673428 +0000 UTC m=+142.050432052" watchObservedRunningTime="2025-12-01 10:33:22.752886552 +0000 UTC m=+142.056645166" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.754930 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-827fw"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.774266 4761 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.782698 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.783994 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-q54z4"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.784059 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.784375 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.284358316 +0000 UTC m=+142.588116940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.790871 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:22 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:22 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:22 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.790921 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.793104 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.794797 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.802088 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-fqctr" podStartSLOduration=121.802068832 podStartE2EDuration="2m1.802068832s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.798388433 +0000 UTC m=+142.102147047" watchObservedRunningTime="2025-12-01 10:33:22.802068832 +0000 UTC m=+142.105827456" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.809682 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xf5wg"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.822890 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl"] Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.834195 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gxtn" podStartSLOduration=121.834176475 podStartE2EDuration="2m1.834176475s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.830142805 +0000 UTC m=+142.133901429" watchObservedRunningTime="2025-12-01 10:33:22.834176475 +0000 UTC m=+142.137935099" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.875220 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7r4l" podStartSLOduration=122.875198682 podStartE2EDuration="2m2.875198682s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.871950776 +0000 UTC m=+142.175709400" watchObservedRunningTime="2025-12-01 10:33:22.875198682 +0000 UTC m=+142.178957316" Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.899829 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:22 crc kubenswrapper[4761]: E1201 10:33:22.902452 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.402431981 +0000 UTC m=+142.706190605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:22 crc kubenswrapper[4761]: I1201 10:33:22.927051 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-bz89h" podStartSLOduration=121.927033141 podStartE2EDuration="2m1.927033141s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.926961119 +0000 UTC m=+142.230719733" watchObservedRunningTime="2025-12-01 10:33:22.927033141 +0000 UTC m=+142.230791775" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.001494 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.001869 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.501837221 +0000 UTC m=+142.805595845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.103070 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.103772 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.603756646 +0000 UTC m=+142.907515270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.204970 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" event={"ID":"5085aee7-8987-489e-86af-3c11f1a6618d","Type":"ContainerStarted","Data":"421bc5bd9f63eab21595c9f0622b9daeee9fa8fb17105238ad2bdf8a4c3c7847"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.205015 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" event={"ID":"5085aee7-8987-489e-86af-3c11f1a6618d","Type":"ContainerStarted","Data":"51d60ea500abbd850961fdc4376795e9ca69c28564441320815e2be3e5d67ee0"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.206112 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.206278 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.706256798 +0000 UTC m=+143.010015422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.206417 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.206665 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" event={"ID":"0b62db57-21d8-498f-9d27-8030bc510076","Type":"ContainerStarted","Data":"f78e6b135013b76e52496fb76b976200cb2d6346694fa9017ef23a2a68f8e0e1"} Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.206677 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.70666705 +0000 UTC m=+143.010425664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.208705 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" event={"ID":"8555dd96-901c-4ef4-b63b-816d54e1489b","Type":"ContainerStarted","Data":"7db40a31b49798212d0956ade6d1af813d7c16e54512656d0acd0801c6150920"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.214584 4761 generic.go:334] "Generic (PLEG): container finished" podID="3454eba8-593e-4647-8b4b-71e0f432ffeb" containerID="08114d3c008d2925920ac683d21f0e6279aee092bddca17b326c2aed5c35d78b" exitCode=0 Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.214644 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" event={"ID":"3454eba8-593e-4647-8b4b-71e0f432ffeb","Type":"ContainerDied","Data":"08114d3c008d2925920ac683d21f0e6279aee092bddca17b326c2aed5c35d78b"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.215988 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r4655" event={"ID":"78b1b0de-2b28-45c2-a655-fe3edb1c72d8","Type":"ContainerStarted","Data":"3826cc2ef0f36f5b24a404825b55726b8fd08f3e200f51d51dafe3c8d343815b"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.216405 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.220839 4761 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" event={"ID":"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9","Type":"ContainerStarted","Data":"e3707f814cdceeec2b951d9945175d94fca1f85b17c2182f7139ca0b55aef957"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.225503 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" event={"ID":"49c0687d-9489-4429-9ef2-09f82f7df268","Type":"ContainerStarted","Data":"b68034c7b734f4f7cc0cbc32cb4d70419e853c9053d399ab4c1f90dac41bef42"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.226592 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" event={"ID":"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48","Type":"ContainerStarted","Data":"6f2f7c369ae638f961aa248101b8240a9fe45657c4219f357465a1d841e059a4"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.231382 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" event={"ID":"49f94e97-89ed-41ca-b0c1-620d9e69ae81","Type":"ContainerStarted","Data":"1ad136e6c523c6100b02f2e0563b94b2246cd555c749bdf70d786bb222a14ad3"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.243168 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" event={"ID":"6932484c-2cc4-42a6-816f-c368946e0a29","Type":"ContainerStarted","Data":"5be20d0347b5ac2cf3bed1eeea8979fe36a6d9aae70fea5b37beb5797bc3627b"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.251853 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zlgvt" event={"ID":"ddd9b688-a86b-4d31-b3f6-eb674a12d438","Type":"ContainerStarted","Data":"53f29921aeb86cfe7ccf0000e7533d86d3be86e683b8e11565a640a53d0df3e9"} Dec 01 10:33:23 crc kubenswrapper[4761]: 
I1201 10:33:23.254889 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" event={"ID":"e3ce8fad-1931-4034-96f9-6b9750665a36","Type":"ContainerStarted","Data":"72fae3e12f85c9c7540ff3d5021b2ea58efb6ad812eee43218eeb3066e0e5333"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.258036 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" event={"ID":"06629698-b5a8-41a6-b94b-771abc920e20","Type":"ContainerStarted","Data":"dfeaedfac5825e5259ef1267604278183ce0b2ffdf39fd631ba7f435b834f6ea"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.259797 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" event={"ID":"c9161413-554f-4d53-bc23-efd48ff91a94","Type":"ContainerStarted","Data":"37c31277c639225a9bba38eb4fd8b24da58a3083e9c796419344734ffda93868"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.263542 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xf5wg" event={"ID":"2248852d-ea7a-49eb-bbfc-c87aa7f6c597","Type":"ContainerStarted","Data":"125ea914a01b85a1ce5441d211c35b7dfdd6a0051fb8306e5eba150f16f6bcb9"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.274139 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" event={"ID":"2428d96d-0ab2-45e1-8bd1-9bfdbd5dfdea","Type":"ContainerStarted","Data":"99ac7e1ad720d5f2d2feee766adb15e5ad60ede41ccaa120ca0d5fee975f8fb1"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.277305 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" event={"ID":"0373f01a-1b29-45f1-a72b-f96dbfb5e359","Type":"ContainerStarted","Data":"a5bd1e3a86ebd196ade56d306021217b2206a33c23e26c77727f0a1006a486ed"} Dec 
01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.278047 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" event={"ID":"65e456cc-7c02-479e-a278-af630a5dfd6f","Type":"ContainerStarted","Data":"1428c05eadf3a703089f121e4da4715cca5d3716e800e41fc7c034d5be691c0b"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.280638 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" event={"ID":"1c36a280-252a-48bd-a64d-be969429a43d","Type":"ContainerStarted","Data":"dacda50987300a7c8a6534e9b2f36e2df636daa4463388fb3612e50ca7394b8f"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.286899 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" event={"ID":"0a304a57-2fa4-477c-8d57-4f411e4f8790","Type":"ContainerStarted","Data":"e822a0fcc39f9f4a3a4bcd9d3b96b3c1b2d65443a876d034d9e0c99b758af9a7"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.291649 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" event={"ID":"373f0364-84dc-446c-87fa-bb03f4bf1baf","Type":"ContainerStarted","Data":"9fa3e5f8ca354b32dd29de6a2f24d069a3daf67f9c86c05df3405ea80dc5a415"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.306945 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.307136 4761 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.80709894 +0000 UTC m=+143.110857564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.311166 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.316643 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.816630233 +0000 UTC m=+143.120388857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.316783 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" event={"ID":"f2e7458e-68c1-4b57-a6f5-43eed3453e64","Type":"ContainerStarted","Data":"a4630611881fa0fcf33867589ef20e2e472b31e979d703ec4a185d65b2870a5c"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.319222 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" event={"ID":"52d03758-6fb1-4040-ae86-d2a89d6cc88f","Type":"ContainerStarted","Data":"6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.319246 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" event={"ID":"52d03758-6fb1-4040-ae86-d2a89d6cc88f","Type":"ContainerStarted","Data":"6291c97bcd9c6acb8976631188cfcb7f77f0d1f37ca4ee1c536b97964249160a"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.320203 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.325030 4761 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pgskl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: 
connection refused" start-of-body= Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.325077 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" podUID="52d03758-6fb1-4040-ae86-d2a89d6cc88f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.325403 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" event={"ID":"436b751c-ff5f-4b20-b63c-9960d2bebfb5","Type":"ContainerStarted","Data":"0b43b289a8c6804844f1f295926c019dfccf720a5522d5dd201e27b3bad8e275"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.365151 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-472l6" event={"ID":"a2762223-c499-429c-814a-00ead7b447d8","Type":"ContainerStarted","Data":"24a22e1e319797de83ad70e6201cb18dc876e44a643c5f32da6939b00f21fcb6"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.407938 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" event={"ID":"8efe62a3-ec31-4144-8d34-150502a96362","Type":"ContainerStarted","Data":"85e47dde54659b0b259baf7f0dca680b2e2a3dc7b35308b125948b4a392cb508"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.408266 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" event={"ID":"8efe62a3-ec31-4144-8d34-150502a96362","Type":"ContainerStarted","Data":"467fe2cde42ffba22a7dcf15064dbddaf32f7805be77e522fad2f81948dfd2a9"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.408836 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.417394 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" event={"ID":"82353b64-ab8c-431e-8732-ba585bd9cc95","Type":"ContainerStarted","Data":"53fa5bb11e74bba263cfcfe3a9f684159a2b78736fb3f6e43e211760cfa86a81"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.426649 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.427084 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.927059401 +0000 UTC m=+143.230818065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.427198 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.428245 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:23.928231535 +0000 UTC m=+143.231990159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.430631 4761 generic.go:334] "Generic (PLEG): container finished" podID="906c8f55-3191-4b35-a7d2-80fd512d5c34" containerID="ebf48c5d568337aeb4bbc4289a60483a609ddc76176d3f2b8ab81e022948d7e1" exitCode=0 Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.432666 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" event={"ID":"906c8f55-3191-4b35-a7d2-80fd512d5c34","Type":"ContainerDied","Data":"ebf48c5d568337aeb4bbc4289a60483a609ddc76176d3f2b8ab81e022948d7e1"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.432731 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" event={"ID":"906c8f55-3191-4b35-a7d2-80fd512d5c34","Type":"ContainerStarted","Data":"9ca43ee9cf3e6a577c9ba963413aa632b62518abeb3d26f93c8a31f6a62aa20a"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.442938 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.450674 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" event={"ID":"873ee65e-5320-4949-8caa-893b41061408","Type":"ContainerStarted","Data":"0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 
10:33:23.451598 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.453849 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" event={"ID":"10e2d759-27e8-4e8f-8d6b-86817b091df5","Type":"ContainerStarted","Data":"aaa98b2c6fb1c9b33fa9abbfaffe7d30ead57a294d58e05bc158643690ca934d"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.481850 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-npj9f" event={"ID":"4a70f5c2-aba5-46bb-a96b-da503d30e66e","Type":"ContainerStarted","Data":"4c21f70c5a88bb67da71458ed2e4dd69fb08c59519782af25f1706fa538028d9"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.481888 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-npj9f" event={"ID":"4a70f5c2-aba5-46bb-a96b-da503d30e66e","Type":"ContainerStarted","Data":"b13f06cd8e5ebed8d3e874a0783e77119fd82c12201eb744808e803c65fa34ab"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.489580 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" event={"ID":"7886c492-0b69-4cb1-aef7-08e7e482bc6a","Type":"ContainerStarted","Data":"8086c75beb2d4d1c763b580c8b1e5e41fe96f1fba3767e3e4325a29e489c508b"} Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.490508 4761 patch_prober.go:28] interesting pod/downloads-7954f5f757-fqctr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.490547 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fqctr" 
podUID="7713e7f9-1a0c-448b-9814-c143fdd040ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.528406 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.529314 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.029295545 +0000 UTC m=+143.333054169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.631359 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.631912 4761 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.13190162 +0000 UTC m=+143.435660244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.732863 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.733018 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.233000761 +0000 UTC m=+143.536759385 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.733943 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.734253 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.234245298 +0000 UTC m=+143.538003922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.777388 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:23 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:23 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:23 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.777434 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.828047 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.835309 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.835466 4761 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.335439161 +0000 UTC m=+143.639197795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.835865 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.836360 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.336336008 +0000 UTC m=+143.640094682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.884767 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rdbjm" podStartSLOduration=122.884747385 podStartE2EDuration="2m2.884747385s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:23.883672003 +0000 UTC m=+143.187430637" watchObservedRunningTime="2025-12-01 10:33:23.884747385 +0000 UTC m=+143.188506009" Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.941428 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:23 crc kubenswrapper[4761]: E1201 10:33:23.941774 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.441760717 +0000 UTC m=+143.745519341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:23 crc kubenswrapper[4761]: I1201 10:33:23.983057 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-r4655" podStartSLOduration=123.983037152 podStartE2EDuration="2m3.983037152s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:23.982011111 +0000 UTC m=+143.285769735" watchObservedRunningTime="2025-12-01 10:33:23.983037152 +0000 UTC m=+143.286795776" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.006497 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" podStartSLOduration=124.006480278 podStartE2EDuration="2m4.006480278s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.004767087 +0000 UTC m=+143.308525731" watchObservedRunningTime="2025-12-01 10:33:24.006480278 +0000 UTC m=+143.310238902" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.012209 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-r4655" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.043826 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.044281 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.544270659 +0000 UTC m=+143.848029283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.060498 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fdthh" podStartSLOduration=123.060480971 podStartE2EDuration="2m3.060480971s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.057665777 +0000 UTC m=+143.361424391" watchObservedRunningTime="2025-12-01 10:33:24.060480971 +0000 UTC m=+143.364239595" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.144741 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.145125 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.645106422 +0000 UTC m=+143.948865046 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.193322 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" podStartSLOduration=123.193306433 podStartE2EDuration="2m3.193306433s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.190963493 +0000 UTC m=+143.494722117" watchObservedRunningTime="2025-12-01 10:33:24.193306433 +0000 UTC m=+143.497065057" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.247669 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: 
\"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.248108 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.748092299 +0000 UTC m=+144.051850923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.255408 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" podStartSLOduration=124.255390895 podStartE2EDuration="2m4.255390895s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.249901762 +0000 UTC m=+143.553660386" watchObservedRunningTime="2025-12-01 10:33:24.255390895 +0000 UTC m=+143.559149519" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.283917 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" podStartSLOduration=123.283902332 podStartE2EDuration="2m3.283902332s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.282968204 
+0000 UTC m=+143.586726828" watchObservedRunningTime="2025-12-01 10:33:24.283902332 +0000 UTC m=+143.587660956" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.348667 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.349486 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.849463807 +0000 UTC m=+144.153222441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.355021 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-wqskh" podStartSLOduration=123.355004042 podStartE2EDuration="2m3.355004042s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.353350013 +0000 UTC m=+143.657108637" watchObservedRunningTime="2025-12-01 10:33:24.355004042 +0000 UTC m=+143.658762666" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 
10:33:24.356103 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lwd6m" podStartSLOduration=123.356097344 podStartE2EDuration="2m3.356097344s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.328630719 +0000 UTC m=+143.632389353" watchObservedRunningTime="2025-12-01 10:33:24.356097344 +0000 UTC m=+143.659855968" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.451354 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.451749 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:24.951731913 +0000 UTC m=+144.255490607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.504198 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" event={"ID":"0a304a57-2fa4-477c-8d57-4f411e4f8790","Type":"ContainerStarted","Data":"989a60d148c58917106b86b6d552af6ed801191df744c4916a21f5000c5567f2"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.513234 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" event={"ID":"6932484c-2cc4-42a6-816f-c368946e0a29","Type":"ContainerStarted","Data":"f28182dea26ef967557447501f2ebc42b5fd0fd98b001b2535639fbff221ce82"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.517197 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" event={"ID":"82353b64-ab8c-431e-8732-ba585bd9cc95","Type":"ContainerStarted","Data":"50f6eda11cf5a9544483b8218d64efb831f021b7b3890e33e50e20e25ed8b64a"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.520114 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" event={"ID":"5085aee7-8987-489e-86af-3c11f1a6618d","Type":"ContainerStarted","Data":"efde2c5c3236a4d8262eb4d1f211033a3c24217402b204b831c2f0904f0e565b"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.523110 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" event={"ID":"0b62db57-21d8-498f-9d27-8030bc510076","Type":"ContainerStarted","Data":"9531e962767e5d57ad80d855d264d79def04cba227898377d7c869fd5201ce93"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.529598 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" event={"ID":"1aa5c7a7-c270-4c62-b054-88a85fbfc8b9","Type":"ContainerStarted","Data":"7cfa90d7c157c0decde8393a9203a6b4f618c53c3c4a8e820a5835e9f9945a5c"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.534159 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tm88h" podStartSLOduration=123.534141429 podStartE2EDuration="2m3.534141429s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.53114993 +0000 UTC m=+143.834908564" watchObservedRunningTime="2025-12-01 10:33:24.534141429 +0000 UTC m=+143.837900053" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.535834 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zlgvt" event={"ID":"ddd9b688-a86b-4d31-b3f6-eb674a12d438","Type":"ContainerStarted","Data":"7c23e6d15e0b8a56fcf606bb49e62fcad15cc6ecbb8e7edb1567bfa253af2e99"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.548310 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-926pr" podStartSLOduration=123.548288679 podStartE2EDuration="2m3.548288679s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
10:33:24.548204266 +0000 UTC m=+143.851962890" watchObservedRunningTime="2025-12-01 10:33:24.548288679 +0000 UTC m=+143.852047303" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.549742 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" event={"ID":"60f0fecb-6dd3-4b45-9dfd-cdde8814bf48","Type":"ContainerStarted","Data":"986a8347fda509ef848d40ccef9889c446987e5b1d24c960c2a05877283a8492"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.552446 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.553011 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.052988418 +0000 UTC m=+144.356747042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.559436 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" event={"ID":"8555dd96-901c-4ef4-b63b-816d54e1489b","Type":"ContainerStarted","Data":"4b99b7ba092753d207e46cf9fb72dcd09b1f0e6fc8e298e3e26e3a1739bb762a"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.568175 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8nfjc" podStartSLOduration=123.568153678 podStartE2EDuration="2m3.568153678s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.564720816 +0000 UTC m=+143.868479460" watchObservedRunningTime="2025-12-01 10:33:24.568153678 +0000 UTC m=+143.871912302" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.589104 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-xzg25" podStartSLOduration=123.5890871 podStartE2EDuration="2m3.5890871s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.588204463 +0000 UTC m=+143.891963087" watchObservedRunningTime="2025-12-01 10:33:24.5890871 +0000 UTC m=+143.892845724" Dec 01 10:33:24 crc 
kubenswrapper[4761]: I1201 10:33:24.628133 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ft8qk" podStartSLOduration=123.628110048 podStartE2EDuration="2m3.628110048s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.627924882 +0000 UTC m=+143.931683506" watchObservedRunningTime="2025-12-01 10:33:24.628110048 +0000 UTC m=+143.931868672" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.638371 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" event={"ID":"3454eba8-593e-4647-8b4b-71e0f432ffeb","Type":"ContainerStarted","Data":"c7b2f0c82c7324e29d70368e13d3fc5cae97b10ccfd8da986ebd9c2ff47d44ae"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.653500 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.654874 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.154862392 +0000 UTC m=+144.458621016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.666704 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zlgvt" podStartSLOduration=7.666684773 podStartE2EDuration="7.666684773s" podCreationTimestamp="2025-12-01 10:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.665346333 +0000 UTC m=+143.969104947" watchObservedRunningTime="2025-12-01 10:33:24.666684773 +0000 UTC m=+143.970443397" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.678471 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" event={"ID":"436b751c-ff5f-4b20-b63c-9960d2bebfb5","Type":"ContainerStarted","Data":"51e9767609b9460ecc789e4493164d5069c9a2194279abb79839859ab2b7cc4e"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.708328 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" event={"ID":"0373f01a-1b29-45f1-a72b-f96dbfb5e359","Type":"ContainerStarted","Data":"9e2103348241306825bd63dbea3c00d9b83764caf591096ae45769cf560efe1a"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.709253 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 
10:33:24.719461 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pgskl"] Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.743814 4761 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-gqb5c container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.743860 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" podUID="0373f01a-1b29-45f1-a72b-f96dbfb5e359" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.768522 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.768843 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8vdgh" podStartSLOduration=123.768832384 podStartE2EDuration="2m3.768832384s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.768087292 +0000 UTC m=+144.071845916" watchObservedRunningTime="2025-12-01 10:33:24.768832384 +0000 UTC m=+144.072591008" Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.769630 4761 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.269618548 +0000 UTC m=+144.573377172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.788711 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:24 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:24 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:24 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.788763 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.796702 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-472l6" event={"ID":"a2762223-c499-429c-814a-00ead7b447d8","Type":"ContainerStarted","Data":"5d0df86220fe9997b9fd593aeca977df0882e0f5383188aa276bbe81d231c91a"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.847417 4761 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" event={"ID":"10e2d759-27e8-4e8f-8d6b-86817b091df5","Type":"ContainerStarted","Data":"d6aff417f4e754037ca8a60190fef9aa9a0d96edd08d99a5ca222039d3b96cb9"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.869894 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.870646 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.370627066 +0000 UTC m=+144.674385690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.878951 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" event={"ID":"7886c492-0b69-4cb1-aef7-08e7e482bc6a","Type":"ContainerStarted","Data":"0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.880143 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.895951 4761 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2xx98 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.895997 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.896346 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" 
event={"ID":"65e456cc-7c02-479e-a278-af630a5dfd6f","Type":"ContainerStarted","Data":"4f460d62875072b546d0789dd360a00edd01f06bb0fc19b0c286192993fd3e92"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.902252 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" podStartSLOduration=123.902232634 podStartE2EDuration="2m3.902232634s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.84650593 +0000 UTC m=+144.150264554" watchObservedRunningTime="2025-12-01 10:33:24.902232634 +0000 UTC m=+144.205991248" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.913458 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" event={"ID":"373f0364-84dc-446c-87fa-bb03f4bf1baf","Type":"ContainerStarted","Data":"cd21e476bac0dbf30ae637e61a36c370afe4b887c3e9555e01bc41b091695e4f"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.921702 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.921734 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.928814 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xf5wg" event={"ID":"2248852d-ea7a-49eb-bbfc-c87aa7f6c597","Type":"ContainerStarted","Data":"a094579860d40c937e564b61bf64eba0658b2355bf6f31f11b17dd4d6afaf359"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.936798 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" 
event={"ID":"e3ce8fad-1931-4034-96f9-6b9750665a36","Type":"ContainerStarted","Data":"6c2a22d2f52afb456e8a8795590f6487804f8705208e4034221ddf30aff119c3"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.937659 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.941722 4761 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hxs9g container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.941768 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" podUID="e3ce8fad-1931-4034-96f9-6b9750665a36" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.952005 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" event={"ID":"906c8f55-3191-4b35-a7d2-80fd512d5c34","Type":"ContainerStarted","Data":"d93d0665a03b83a7310b8252754acaf8f15994fc4ed64dc0d462cb4f84681611"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.952504 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.954202 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-472l6" podStartSLOduration=123.954183196 podStartE2EDuration="2m3.954183196s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.911598552 +0000 UTC m=+144.215357176" watchObservedRunningTime="2025-12-01 10:33:24.954183196 +0000 UTC m=+144.257941820" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.954866 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5xxsl" podStartSLOduration=123.954861956 podStartE2EDuration="2m3.954861956s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.952704592 +0000 UTC m=+144.256463216" watchObservedRunningTime="2025-12-01 10:33:24.954861956 +0000 UTC m=+144.258620580" Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.972544 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" event={"ID":"f1827035-d23f-4436-96ee-f363b9ea9022","Type":"ContainerStarted","Data":"6af6b555ec1130d25879e7f5fee9703a63b0f74b06d6159d640deb283ec2b1bc"} Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.973090 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:24 crc kubenswrapper[4761]: E1201 10:33:24.973844 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:25.473829719 +0000 UTC m=+144.777588343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:24 crc kubenswrapper[4761]: I1201 10:33:24.978227 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" podStartSLOduration=123.978210999 podStartE2EDuration="2m3.978210999s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.975380715 +0000 UTC m=+144.279139339" watchObservedRunningTime="2025-12-01 10:33:24.978210999 +0000 UTC m=+144.281969623" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.000247 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" event={"ID":"49c0687d-9489-4429-9ef2-09f82f7df268","Type":"ContainerStarted","Data":"af8ff5051c5c4f6ea0bf59e67b2ec0324adc7ef97b38385cce3c2e107621ecb5"} Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.001138 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kjxwd" podStartSLOduration=124.001127509 podStartE2EDuration="2m4.001127509s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:24.999941904 
+0000 UTC m=+144.303700528" watchObservedRunningTime="2025-12-01 10:33:25.001127509 +0000 UTC m=+144.304886133" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.012749 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" event={"ID":"1c36a280-252a-48bd-a64d-be969429a43d","Type":"ContainerStarted","Data":"35f6c084b363dec2bd31e475d22d02e79f4b4d8c1bccb1337f6f7fa7c5c86c48"} Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.020858 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.050945 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" podStartSLOduration=125.050930457 podStartE2EDuration="2m5.050930457s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:25.024875734 +0000 UTC m=+144.328634378" watchObservedRunningTime="2025-12-01 10:33:25.050930457 +0000 UTC m=+144.354689081" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.051138 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" podStartSLOduration=124.051134833 podStartE2EDuration="2m4.051134833s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:25.048924237 +0000 UTC m=+144.352682861" watchObservedRunningTime="2025-12-01 10:33:25.051134833 +0000 UTC m=+144.354893457" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.076822 4761 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.077259 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.577242198 +0000 UTC m=+144.881000822 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.080936 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zqrqq" podStartSLOduration=125.080920577 podStartE2EDuration="2m5.080920577s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:25.078892467 +0000 UTC m=+144.382651091" watchObservedRunningTime="2025-12-01 10:33:25.080920577 +0000 UTC m=+144.384679191" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.178455 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.178633 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.678604076 +0000 UTC m=+144.982362700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.178714 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.179064 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.679048849 +0000 UTC m=+144.982807463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.278450 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" podStartSLOduration=124.278431249 podStartE2EDuration="2m4.278431249s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:25.112612058 +0000 UTC m=+144.416370682" watchObservedRunningTime="2025-12-01 10:33:25.278431249 +0000 UTC m=+144.582189873" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.279461 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.279687 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.779654536 +0000 UTC m=+145.083413160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.279796 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.280118 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.780110729 +0000 UTC m=+145.083869353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.280735 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d99sk"] Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.281572 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.283672 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.330510 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d99sk"] Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.381101 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.381289 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.881261781 +0000 UTC m=+145.185020405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.381423 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-catalog-content\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.381490 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-utilities\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.381658 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42248\" (UniqueName: \"kubernetes.io/projected/1e69dab2-4c11-4352-95c8-92499a4c5a75-kube-api-access-42248\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.381689 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.381986 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.881972572 +0000 UTC m=+145.185731196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.481464 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nztd7"] Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.482223 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.482467 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.982450565 +0000 UTC m=+145.286209179 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.482571 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.482581 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-utilities\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.482709 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42248\" (UniqueName: \"kubernetes.io/projected/1e69dab2-4c11-4352-95c8-92499a4c5a75-kube-api-access-42248\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.482744 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.482867 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-catalog-content\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.483136 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:25.983128805 +0000 UTC m=+145.286887429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.483621 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-catalog-content\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.483655 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-utilities\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.485768 4761 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.496151 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nztd7"] Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.551499 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42248\" (UniqueName: \"kubernetes.io/projected/1e69dab2-4c11-4352-95c8-92499a4c5a75-kube-api-access-42248\") pod \"community-operators-d99sk\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.583658 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.583802 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-utilities\") pod \"certified-operators-nztd7\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.583844 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.083818363 +0000 UTC m=+145.387576987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.583948 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.583981 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktwjw\" (UniqueName: \"kubernetes.io/projected/3fe88ace-f487-4b05-a9de-d5bdd2945c75-kube-api-access-ktwjw\") pod \"certified-operators-nztd7\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.584063 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-catalog-content\") pod \"certified-operators-nztd7\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.584219 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:26.084204735 +0000 UTC m=+145.387963359 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.594711 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.679209 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5c886"] Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.680572 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.684947 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.685184 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.185160771 +0000 UTC m=+145.488919395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.685273 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktwjw\" (UniqueName: \"kubernetes.io/projected/3fe88ace-f487-4b05-a9de-d5bdd2945c75-kube-api-access-ktwjw\") pod \"certified-operators-nztd7\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.685311 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.685375 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-catalog-content\") pod \"certified-operators-nztd7\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.685543 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-utilities\") pod \"certified-operators-nztd7\" (UID: 
\"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.685992 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-utilities\") pod \"certified-operators-nztd7\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.686026 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.186014186 +0000 UTC m=+145.489772810 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.686209 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-catalog-content\") pod \"certified-operators-nztd7\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.693390 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5c886"] Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.723047 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ktwjw\" (UniqueName: \"kubernetes.io/projected/3fe88ace-f487-4b05-a9de-d5bdd2945c75-kube-api-access-ktwjw\") pod \"certified-operators-nztd7\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.781760 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:25 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:25 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:25 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.781806 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.788218 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.788422 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8bg\" (UniqueName: \"kubernetes.io/projected/5821e59d-de93-43fd-822d-83128ce780de-kube-api-access-8x8bg\") pod \"community-operators-5c886\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.788443 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-catalog-content\") pod \"community-operators-5c886\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.788472 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-utilities\") pod \"community-operators-5c886\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.788601 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.288586851 +0000 UTC m=+145.592345475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.851498 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.890592 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8bg\" (UniqueName: \"kubernetes.io/projected/5821e59d-de93-43fd-822d-83128ce780de-kube-api-access-8x8bg\") pod \"community-operators-5c886\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.890631 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-catalog-content\") pod \"community-operators-5c886\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.890650 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.890675 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-utilities\") pod \"community-operators-5c886\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.891124 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-utilities\") pod \"community-operators-5c886\" 
(UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.891582 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-catalog-content\") pod \"community-operators-5c886\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.891798 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.391786344 +0000 UTC m=+145.695544968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.891814 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qh6dn"] Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.892718 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.924107 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qh6dn"] Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.932834 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8bg\" (UniqueName: \"kubernetes.io/projected/5821e59d-de93-43fd-822d-83128ce780de-kube-api-access-8x8bg\") pod \"community-operators-5c886\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.984576 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.994150 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.994389 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-catalog-content\") pod \"certified-operators-qh6dn\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.994422 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jl5\" (UniqueName: \"kubernetes.io/projected/a570f753-345e-40b8-a088-2d28ecf41896-kube-api-access-v2jl5\") pod \"certified-operators-qh6dn\" (UID: 
\"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.994508 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-utilities\") pod \"certified-operators-qh6dn\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:25 crc kubenswrapper[4761]: I1201 10:33:25.994665 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5c886" Dec 01 10:33:25 crc kubenswrapper[4761]: E1201 10:33:25.995085 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.495064579 +0000 UTC m=+145.798823203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.060353 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" event={"ID":"6932484c-2cc4-42a6-816f-c368946e0a29","Type":"ContainerStarted","Data":"92cf2e304a6c523392b1d940684c9a5af9a56a1aeda075032a5552f72668af83"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.060863 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.061415 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d99sk"] Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.067265 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" event={"ID":"373f0364-84dc-446c-87fa-bb03f4bf1baf","Type":"ContainerStarted","Data":"7d4e4884c52cc3972dfbbd1d31046d01b1c4cacdfcf96ae577b69fac9cb7875e"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.076480 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-827fw" event={"ID":"49c0687d-9489-4429-9ef2-09f82f7df268","Type":"ContainerStarted","Data":"2befabac31cc68d2100a21ce36e0193d0d66eabd9370accf0045af0173f74ec0"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.091244 4761 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" podStartSLOduration=125.091228643 podStartE2EDuration="2m5.091228643s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:26.085614456 +0000 UTC m=+145.389373080" watchObservedRunningTime="2025-12-01 10:33:26.091228643 +0000 UTC m=+145.394987267" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.099107 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.099171 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-utilities\") pod \"certified-operators-qh6dn\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.099235 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-catalog-content\") pod \"certified-operators-qh6dn\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.099267 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jl5\" (UniqueName: \"kubernetes.io/projected/a570f753-345e-40b8-a088-2d28ecf41896-kube-api-access-v2jl5\") pod 
\"certified-operators-qh6dn\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.099883 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.599869069 +0000 UTC m=+145.903627693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.100757 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-utilities\") pod \"certified-operators-qh6dn\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.100924 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-catalog-content\") pod \"certified-operators-qh6dn\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.115738 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" 
event={"ID":"8555dd96-901c-4ef4-b63b-816d54e1489b","Type":"ContainerStarted","Data":"fe9dacfc79473b20727a04c37263f3b4f528940a54d90cf8c3e749c1dca6f861"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.129710 4761 generic.go:334] "Generic (PLEG): container finished" podID="c9161413-554f-4d53-bc23-efd48ff91a94" containerID="37c31277c639225a9bba38eb4fd8b24da58a3083e9c796419344734ffda93868" exitCode=0 Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.129775 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" event={"ID":"c9161413-554f-4d53-bc23-efd48ff91a94","Type":"ContainerDied","Data":"37c31277c639225a9bba38eb4fd8b24da58a3083e9c796419344734ffda93868"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.146725 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" event={"ID":"f1827035-d23f-4436-96ee-f363b9ea9022","Type":"ContainerStarted","Data":"6c5454db18f7fc7fe6572a3bf092c56ee6c85a3264f9e24423e071e959b60d40"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.146772 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" event={"ID":"f1827035-d23f-4436-96ee-f363b9ea9022","Type":"ContainerStarted","Data":"26c731770a75e3a22a6ab12f2269a34e262b9d3bf0b0b4a926e0667bb264ee29"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.156595 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-q54z4" podStartSLOduration=125.156578913 podStartE2EDuration="2m5.156578913s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:26.150994677 +0000 UTC m=+145.454753301" 
watchObservedRunningTime="2025-12-01 10:33:26.156578913 +0000 UTC m=+145.460337537" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.168775 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xf5wg" event={"ID":"2248852d-ea7a-49eb-bbfc-c87aa7f6c597","Type":"ContainerStarted","Data":"864e83055775104c4f02f466bc0f53baf387a081c653ba7272cd8b5635812811"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.169350 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.201446 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jl5\" (UniqueName: \"kubernetes.io/projected/a570f753-345e-40b8-a088-2d28ecf41896-kube-api-access-v2jl5\") pod \"certified-operators-qh6dn\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.201912 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.202943 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.702927208 +0000 UTC m=+146.006685832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.256704 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" event={"ID":"3454eba8-593e-4647-8b4b-71e0f432ffeb","Type":"ContainerStarted","Data":"b309d77a12b8673272712257ce48e3a83d7671610581289efc7600730d6074a6"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.273811 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.312857 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.314466 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.814444078 +0000 UTC m=+146.118202702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.316789 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" podUID="52d03758-6fb1-4040-ae86-d2a89d6cc88f" containerName="controller-manager" containerID="cri-o://6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4" gracePeriod=30 Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.340821 4761 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2xx98 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.340939 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.344404 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" event={"ID":"0a304a57-2fa4-477c-8d57-4f411e4f8790","Type":"ContainerStarted","Data":"bfb1b69f2c2b9dd3ca996d99464bb442497e2762c5bbf81a0b7a9e1339cbbab1"} Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 
10:33:26.410342 4761 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hxs9g container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.410399 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" podUID="e3ce8fad-1931-4034-96f9-6b9750665a36" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.410676 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k46jr" podStartSLOduration=125.410658364 podStartE2EDuration="2m5.410658364s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:26.2552254 +0000 UTC m=+145.558984044" watchObservedRunningTime="2025-12-01 10:33:26.410658364 +0000 UTC m=+145.714416988" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.412053 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" podStartSLOduration=126.412045325 podStartE2EDuration="2m6.412045325s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:26.408807849 +0000 UTC m=+145.712566473" watchObservedRunningTime="2025-12-01 10:33:26.412045325 +0000 UTC m=+145.715803949" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.416060 4761 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gqb5c" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.421175 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.421799 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.921783004 +0000 UTC m=+146.225541628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.423232 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7mkxr" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.435167 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.437161 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:26.93714486 +0000 UTC m=+146.240903484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.453483 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-std2v" podStartSLOduration=125.453464834 podStartE2EDuration="2m5.453464834s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:26.453083523 +0000 UTC m=+145.756842147" watchObservedRunningTime="2025-12-01 10:33:26.453464834 +0000 UTC m=+145.757223458" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.499758 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" podStartSLOduration=126.499739088 podStartE2EDuration="2m6.499739088s" podCreationTimestamp="2025-12-01 10:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:26.497499641 +0000 UTC m=+145.801258265" 
watchObservedRunningTime="2025-12-01 10:33:26.499739088 +0000 UTC m=+145.803497712" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.538101 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.552021 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.051998429 +0000 UTC m=+146.355757053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.583495 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xf5wg" podStartSLOduration=9.583475973 podStartE2EDuration="9.583475973s" podCreationTimestamp="2025-12-01 10:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:26.536622922 +0000 UTC m=+145.840381546" watchObservedRunningTime="2025-12-01 10:33:26.583475973 +0000 UTC m=+145.887234597" Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.642469 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.642775 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.142762663 +0000 UTC m=+146.446521287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.746624 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.747270 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.247254294 +0000 UTC m=+146.551012918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.779006 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nztd7"] Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.784619 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:26 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:26 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:26 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.784679 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:26 crc kubenswrapper[4761]: W1201 10:33:26.789015 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe88ace_f487_4b05_a9de_d5bdd2945c75.slice/crio-33bb00899968ff73a467bd2fc6f2f8bc5ba037aa9871c2382ea83ceb944cb130 WatchSource:0}: Error finding container 33bb00899968ff73a467bd2fc6f2f8bc5ba037aa9871c2382ea83ceb944cb130: Status 404 returned error can't find the container with id 33bb00899968ff73a467bd2fc6f2f8bc5ba037aa9871c2382ea83ceb944cb130 Dec 01 
10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.847939 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.848589 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.348577341 +0000 UTC m=+146.652335965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.936212 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5c886"] Dec 01 10:33:26 crc kubenswrapper[4761]: I1201 10:33:26.950079 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:26 crc kubenswrapper[4761]: E1201 10:33:26.950426 4761 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.450411583 +0000 UTC m=+146.754170207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.046177 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qh6dn"] Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.053764 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:27 crc kubenswrapper[4761]: E1201 10:33:27.054064 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.554051359 +0000 UTC m=+146.857809983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:27 crc kubenswrapper[4761]: W1201 10:33:27.141900 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda570f753_345e_40b8_a088_2d28ecf41896.slice/crio-e00084b4e8903b70791b0911590c919d23818d52b46b4c07c64bf62ba4675816 WatchSource:0}: Error finding container e00084b4e8903b70791b0911590c919d23818d52b46b4c07c64bf62ba4675816: Status 404 returned error can't find the container with id e00084b4e8903b70791b0911590c919d23818d52b46b4c07c64bf62ba4675816 Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.156669 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:27 crc kubenswrapper[4761]: E1201 10:33:27.156928 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.656914322 +0000 UTC m=+146.960672946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.161081 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.258375 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:27 crc kubenswrapper[4761]: E1201 10:33:27.258711 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.758699442 +0000 UTC m=+147.062458066 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.299857 4761 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.365957 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52d03758-6fb1-4040-ae86-d2a89d6cc88f-serving-cert\") pod \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.366410 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.366512 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-proxy-ca-bundles\") pod \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " Dec 01 10:33:27 crc kubenswrapper[4761]: E1201 10:33:27.366643 4761 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.866618586 +0000 UTC m=+147.170377210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.367193 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-client-ca" (OuterVolumeSpecName: "client-ca") pod "52d03758-6fb1-4040-ae86-d2a89d6cc88f" (UID: "52d03758-6fb1-4040-ae86-d2a89d6cc88f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.367211 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "52d03758-6fb1-4040-ae86-d2a89d6cc88f" (UID: "52d03758-6fb1-4040-ae86-d2a89d6cc88f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.367246 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-client-ca\") pod \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.367300 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4wzb\" (UniqueName: \"kubernetes.io/projected/52d03758-6fb1-4040-ae86-d2a89d6cc88f-kube-api-access-r4wzb\") pod \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.367606 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-config\") pod \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\" (UID: \"52d03758-6fb1-4040-ae86-d2a89d6cc88f\") " Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.368116 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.368190 4761 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.368206 4761 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:27 crc kubenswrapper[4761]: E1201 10:33:27.368440 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.868429179 +0000 UTC m=+147.172187803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5s745" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.369206 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-config" (OuterVolumeSpecName: "config") pod "52d03758-6fb1-4040-ae86-d2a89d6cc88f" (UID: "52d03758-6fb1-4040-ae86-d2a89d6cc88f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.376133 4761 generic.go:334] "Generic (PLEG): container finished" podID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerID="52a8b80f6539ce43d2586d052feb8e21ce9e877b977d2af82cf6c3fcc96780f5" exitCode=0 Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.376418 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nztd7" event={"ID":"3fe88ace-f487-4b05-a9de-d5bdd2945c75","Type":"ContainerDied","Data":"52a8b80f6539ce43d2586d052feb8e21ce9e877b977d2af82cf6c3fcc96780f5"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.376477 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nztd7" event={"ID":"3fe88ace-f487-4b05-a9de-d5bdd2945c75","Type":"ContainerStarted","Data":"33bb00899968ff73a467bd2fc6f2f8bc5ba037aa9871c2382ea83ceb944cb130"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.377459 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d03758-6fb1-4040-ae86-d2a89d6cc88f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52d03758-6fb1-4040-ae86-d2a89d6cc88f" (UID: "52d03758-6fb1-4040-ae86-d2a89d6cc88f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.380854 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d03758-6fb1-4040-ae86-d2a89d6cc88f-kube-api-access-r4wzb" (OuterVolumeSpecName: "kube-api-access-r4wzb") pod "52d03758-6fb1-4040-ae86-d2a89d6cc88f" (UID: "52d03758-6fb1-4040-ae86-d2a89d6cc88f"). InnerVolumeSpecName "kube-api-access-r4wzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.385741 4761 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.393724 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-npj9f" event={"ID":"4a70f5c2-aba5-46bb-a96b-da503d30e66e","Type":"ContainerStarted","Data":"6c1387b891a056da40076e8a2d1958bc6c0bf64624cecd6da2ca96b98c3f5410"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.410664 4761 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T10:33:27.299884925Z","Handler":null,"Name":""} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.412932 4761 generic.go:334] "Generic (PLEG): container finished" podID="52d03758-6fb1-4040-ae86-d2a89d6cc88f" containerID="6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4" exitCode=0 Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.413042 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" event={"ID":"52d03758-6fb1-4040-ae86-d2a89d6cc88f","Type":"ContainerDied","Data":"6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.413073 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" event={"ID":"52d03758-6fb1-4040-ae86-d2a89d6cc88f","Type":"ContainerDied","Data":"6291c97bcd9c6acb8976631188cfcb7f77f0d1f37ca4ee1c536b97964249160a"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.413094 4761 scope.go:117] "RemoveContainer" containerID="6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 
10:33:27.413265 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pgskl" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.436704 4761 generic.go:334] "Generic (PLEG): container finished" podID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerID="5fd1fad99bb3a8f8e25ec56d46863c20df66ec101b60c4f8ec0d16dc8b55bbe8" exitCode=0 Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.438564 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d99sk" event={"ID":"1e69dab2-4c11-4352-95c8-92499a4c5a75","Type":"ContainerDied","Data":"5fd1fad99bb3a8f8e25ec56d46863c20df66ec101b60c4f8ec0d16dc8b55bbe8"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.438597 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d99sk" event={"ID":"1e69dab2-4c11-4352-95c8-92499a4c5a75","Type":"ContainerStarted","Data":"ba5c15a400a6d4aed00da4d3c2b64b13f6aab31e49790a40f6b59b4e5595b686"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.470250 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5c886" event={"ID":"5821e59d-de93-43fd-822d-83128ce780de","Type":"ContainerStarted","Data":"c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.470326 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5c886" event={"ID":"5821e59d-de93-43fd-822d-83128ce780de","Type":"ContainerStarted","Data":"f29837df7d9075929fe3cc29214ed5a4ea3fe80e9ac3d92e2eb41a26d1d429bc"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.487841 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:27 crc kubenswrapper[4761]: E1201 10:33:27.494973 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:27.994945504 +0000 UTC m=+147.298704128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.495359 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4wzb\" (UniqueName: \"kubernetes.io/projected/52d03758-6fb1-4040-ae86-d2a89d6cc88f-kube-api-access-r4wzb\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.495388 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52d03758-6fb1-4040-ae86-d2a89d6cc88f-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.495398 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52d03758-6fb1-4040-ae86-d2a89d6cc88f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.505758 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qh6dn" event={"ID":"a570f753-345e-40b8-a088-2d28ecf41896","Type":"ContainerStarted","Data":"e00084b4e8903b70791b0911590c919d23818d52b46b4c07c64bf62ba4675816"} Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.507009 4761 scope.go:117] "RemoveContainer" containerID="6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.511726 4761 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2xx98 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.511779 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 01 10:33:27 crc kubenswrapper[4761]: E1201 10:33:27.511784 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4\": container with ID starting with 6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4 not found: ID does not exist" containerID="6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.511847 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4"} err="failed to get container status \"6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4\": rpc error: code = NotFound desc = could not find container 
\"6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4\": container with ID starting with 6d4b58a147a74300997f892962712963260ccf5b10ecc03e9573081800ad93a4 not found: ID does not exist" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.514835 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fvkf2"] Dec 01 10:33:27 crc kubenswrapper[4761]: E1201 10:33:27.515203 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d03758-6fb1-4040-ae86-d2a89d6cc88f" containerName="controller-manager" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.515226 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d03758-6fb1-4040-ae86-d2a89d6cc88f" containerName="controller-manager" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.517157 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d03758-6fb1-4040-ae86-d2a89d6cc88f" containerName="controller-manager" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.518249 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.533265 4761 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.533297 4761 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.544873 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.549482 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvkf2"] Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.582682 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pgskl"] Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.582733 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pgskl"] Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.606725 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.606791 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqdsj\" (UniqueName: 
\"kubernetes.io/projected/0e51c452-5010-4af5-bb69-941565926337-kube-api-access-vqdsj\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.606992 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-catalog-content\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.607099 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-utilities\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.622470 4761 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.622762 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.622920 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hxs9g" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.708884 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqdsj\" (UniqueName: \"kubernetes.io/projected/0e51c452-5010-4af5-bb69-941565926337-kube-api-access-vqdsj\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.708953 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-catalog-content\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.708982 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-utilities\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " 
pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.709366 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-utilities\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.709739 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-catalog-content\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.733304 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqdsj\" (UniqueName: \"kubernetes.io/projected/0e51c452-5010-4af5-bb69-941565926337-kube-api-access-vqdsj\") pod \"redhat-marketplace-fvkf2\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.778701 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:27 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:27 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:27 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.778755 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.807921 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5s745\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.876883 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.886191 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nktzw"] Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.887140 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.902802 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktzw"] Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.912065 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.912190 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-utilities\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " 
pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.912229 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.912247 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-catalog-content\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.912272 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6cg\" (UniqueName: \"kubernetes.io/projected/95b42196-1572-4e3f-b807-4ef64ed3311f-kube-api-access-qh6cg\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.912321 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.931015 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.937926 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.941139 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.953648 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpsf2"] Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.954334 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.963682 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.967615 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpsf2"] Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.968521 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.973100 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.973325 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.973434 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.973531 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.974437 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:33:27 crc kubenswrapper[4761]: I1201 10:33:27.985318 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.013140 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.013204 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-utilities\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.013233 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-catalog-content\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.013257 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6cg\" (UniqueName: \"kubernetes.io/projected/95b42196-1572-4e3f-b807-4ef64ed3311f-kube-api-access-qh6cg\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.013280 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.014006 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.016887 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v6dsn" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.017600 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-catalog-content\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.017814 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-utilities\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.021429 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.067264 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6cg\" (UniqueName: \"kubernetes.io/projected/95b42196-1572-4e3f-b807-4ef64ed3311f-kube-api-access-qh6cg\") pod \"redhat-marketplace-nktzw\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.095242 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.115822 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lwfm\" (UniqueName: \"kubernetes.io/projected/983972ee-4dc5-4a52-9087-d69d4362b33d-kube-api-access-6lwfm\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.115880 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/983972ee-4dc5-4a52-9087-d69d4362b33d-serving-cert\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.115910 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-config\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.115956 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-client-ca\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.116002 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.174931 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.197792 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.210210 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.223365 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9161413-554f-4d53-bc23-efd48ff91a94-config-volume\") pod \"c9161413-554f-4d53-bc23-efd48ff91a94\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.223405 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8bcm\" (UniqueName: \"kubernetes.io/projected/c9161413-554f-4d53-bc23-efd48ff91a94-kube-api-access-s8bcm\") pod \"c9161413-554f-4d53-bc23-efd48ff91a94\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.223457 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9161413-554f-4d53-bc23-efd48ff91a94-secret-volume\") pod \"c9161413-554f-4d53-bc23-efd48ff91a94\" (UID: \"c9161413-554f-4d53-bc23-efd48ff91a94\") " Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.223724 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-config\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.223747 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-client-ca\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc 
kubenswrapper[4761]: I1201 10:33:28.224101 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.224145 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lwfm\" (UniqueName: \"kubernetes.io/projected/983972ee-4dc5-4a52-9087-d69d4362b33d-kube-api-access-6lwfm\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.224183 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/983972ee-4dc5-4a52-9087-d69d4362b33d-serving-cert\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.241495 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-client-ca\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.259320 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/983972ee-4dc5-4a52-9087-d69d4362b33d-serving-cert\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.265405 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.269044 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.275123 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lwfm\" (UniqueName: \"kubernetes.io/projected/983972ee-4dc5-4a52-9087-d69d4362b33d-kube-api-access-6lwfm\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.276318 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-config\") pod \"controller-manager-879f6c89f-vpsf2\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.276758 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9161413-554f-4d53-bc23-efd48ff91a94-kube-api-access-s8bcm" (OuterVolumeSpecName: "kube-api-access-s8bcm") pod "c9161413-554f-4d53-bc23-efd48ff91a94" (UID: "c9161413-554f-4d53-bc23-efd48ff91a94"). InnerVolumeSpecName "kube-api-access-s8bcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.281159 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9161413-554f-4d53-bc23-efd48ff91a94-config-volume" (OuterVolumeSpecName: "config-volume") pod "c9161413-554f-4d53-bc23-efd48ff91a94" (UID: "c9161413-554f-4d53-bc23-efd48ff91a94"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.293491 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9161413-554f-4d53-bc23-efd48ff91a94-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c9161413-554f-4d53-bc23-efd48ff91a94" (UID: "c9161413-554f-4d53-bc23-efd48ff91a94"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.326655 4761 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9161413-554f-4d53-bc23-efd48ff91a94-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.326702 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8bcm\" (UniqueName: \"kubernetes.io/projected/c9161413-554f-4d53-bc23-efd48ff91a94-kube-api-access-s8bcm\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.326715 4761 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9161413-554f-4d53-bc23-efd48ff91a94-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.346191 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.423900 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvkf2"] Dec 01 10:33:28 crc kubenswrapper[4761]: W1201 10:33:28.457070 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e51c452_5010_4af5_bb69_941565926337.slice/crio-abdd31e9a06bc11898da71133f323843ae38c9eb2dbe6a7242575de4417f25eb WatchSource:0}: Error finding container abdd31e9a06bc11898da71133f323843ae38c9eb2dbe6a7242575de4417f25eb: Status 404 returned error can't find the container with id abdd31e9a06bc11898da71133f323843ae38c9eb2dbe6a7242575de4417f25eb Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.482225 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-55clp"] Dec 01 10:33:28 crc kubenswrapper[4761]: E1201 10:33:28.482663 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9161413-554f-4d53-bc23-efd48ff91a94" containerName="collect-profiles" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.482929 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9161413-554f-4d53-bc23-efd48ff91a94" containerName="collect-profiles" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.483084 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9161413-554f-4d53-bc23-efd48ff91a94" containerName="collect-profiles" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.483790 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.487041 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.503633 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5s745"] Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.506744 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-55clp"] Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.518297 4761 generic.go:334] "Generic (PLEG): container finished" podID="5821e59d-de93-43fd-822d-83128ce780de" containerID="c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed" exitCode=0 Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.518474 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5c886" event={"ID":"5821e59d-de93-43fd-822d-83128ce780de","Type":"ContainerDied","Data":"c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed"} Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.528686 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-utilities\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.528890 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-catalog-content\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " 
pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.529032 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldm8d\" (UniqueName: \"kubernetes.io/projected/6500351a-78de-4cb9-bc74-12a450bbc76e-kube-api-access-ldm8d\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.529580 4761 generic.go:334] "Generic (PLEG): container finished" podID="a570f753-345e-40b8-a088-2d28ecf41896" containerID="9181ab0230eb069f1ac6797499b08b1f03ac9ff586e2363655797276331fd723" exitCode=0 Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.529714 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh6dn" event={"ID":"a570f753-345e-40b8-a088-2d28ecf41896","Type":"ContainerDied","Data":"9181ab0230eb069f1ac6797499b08b1f03ac9ff586e2363655797276331fd723"} Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.532652 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-npj9f" event={"ID":"4a70f5c2-aba5-46bb-a96b-da503d30e66e","Type":"ContainerStarted","Data":"a9998f11981f083d760c639f5a88cf8985284f8a1f9e564d814417902c5d6fd4"} Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.539183 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvkf2" event={"ID":"0e51c452-5010-4af5-bb69-941565926337","Type":"ContainerStarted","Data":"abdd31e9a06bc11898da71133f323843ae38c9eb2dbe6a7242575de4417f25eb"} Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.554462 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" 
event={"ID":"c9161413-554f-4d53-bc23-efd48ff91a94","Type":"ContainerDied","Data":"0498acb76d625d33b28f6fb5d99b4e3b4baea54db9534f577751d76dafb2c123"} Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.554513 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0498acb76d625d33b28f6fb5d99b4e3b4baea54db9534f577751d76dafb2c123" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.554597 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-5ncx6" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.638711 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-catalog-content\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.639504 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldm8d\" (UniqueName: \"kubernetes.io/projected/6500351a-78de-4cb9-bc74-12a450bbc76e-kube-api-access-ldm8d\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.639765 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-utilities\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.639376 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-catalog-content\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.642656 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-utilities\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.679775 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldm8d\" (UniqueName: \"kubernetes.io/projected/6500351a-78de-4cb9-bc74-12a450bbc76e-kube-api-access-ldm8d\") pod \"redhat-operators-55clp\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.739428 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.777101 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:28 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:28 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:28 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.778025 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.875733 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2wblg"] Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.878719 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.899788 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2wblg"] Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.927281 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktzw"] Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.950359 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-catalog-content\") pod \"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.950695 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9jr\" (UniqueName: \"kubernetes.io/projected/6ba40ceb-381a-41e3-8d11-ed171d07ee74-kube-api-access-hc9jr\") pod \"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:28 crc kubenswrapper[4761]: I1201 10:33:28.950760 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-utilities\") pod \"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:29 crc kubenswrapper[4761]: W1201 10:33:29.003997 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-371a9160e1b7ee3f7b95de338422c0cb6f6bab357d5848c5b4cda4198f840bbd WatchSource:0}: Error finding container 
371a9160e1b7ee3f7b95de338422c0cb6f6bab357d5848c5b4cda4198f840bbd: Status 404 returned error can't find the container with id 371a9160e1b7ee3f7b95de338422c0cb6f6bab357d5848c5b4cda4198f840bbd Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.051697 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-catalog-content\") pod \"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.051734 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9jr\" (UniqueName: \"kubernetes.io/projected/6ba40ceb-381a-41e3-8d11-ed171d07ee74-kube-api-access-hc9jr\") pod \"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.051805 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-utilities\") pod \"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.052182 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-utilities\") pod \"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.052378 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-catalog-content\") pod 
\"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.075821 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9jr\" (UniqueName: \"kubernetes.io/projected/6ba40ceb-381a-41e3-8d11-ed171d07ee74-kube-api-access-hc9jr\") pod \"redhat-operators-2wblg\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:29 crc kubenswrapper[4761]: W1201 10:33:29.076227 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-7ad773ef2cc1545dc6bc925abf23b867c365d8345416523ea96cad4312222760 WatchSource:0}: Error finding container 7ad773ef2cc1545dc6bc925abf23b867c365d8345416523ea96cad4312222760: Status 404 returned error can't find the container with id 7ad773ef2cc1545dc6bc925abf23b867c365d8345416523ea96cad4312222760 Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.082514 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-55clp"] Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.161791 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d03758-6fb1-4040-ae86-d2a89d6cc88f" path="/var/lib/kubelet/pods/52d03758-6fb1-4040-ae86-d2a89d6cc88f/volumes" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.165897 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.166468 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpsf2"] Dec 01 10:33:29 crc kubenswrapper[4761]: W1201 10:33:29.199954 4761 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6500351a_78de_4cb9_bc74_12a450bbc76e.slice/crio-00c3d5d746c5cff512d6ef28384c5e8380e7ad7e5c1e68f896d756a425a50ec8 WatchSource:0}: Error finding container 00c3d5d746c5cff512d6ef28384c5e8380e7ad7e5c1e68f896d756a425a50ec8: Status 404 returned error can't find the container with id 00c3d5d746c5cff512d6ef28384c5e8380e7ad7e5c1e68f896d756a425a50ec8 Dec 01 10:33:29 crc kubenswrapper[4761]: W1201 10:33:29.208606 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod983972ee_4dc5_4a52_9087_d69d4362b33d.slice/crio-5b2e2295d347096af2be3e4789f6ba678118a1dfe8d2e1a7a2de336528385972 WatchSource:0}: Error finding container 5b2e2295d347096af2be3e4789f6ba678118a1dfe8d2e1a7a2de336528385972: Status 404 returned error can't find the container with id 5b2e2295d347096af2be3e4789f6ba678118a1dfe8d2e1a7a2de336528385972 Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.215994 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.560669 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2wblg"] Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.568437 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1045350753e10e320e1659ff5f743be640283c1bad48988e5fea79cde59fbcd2"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.568486 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7ad773ef2cc1545dc6bc925abf23b867c365d8345416523ea96cad4312222760"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.569323 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.571543 4761 generic.go:334] "Generic (PLEG): container finished" podID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerID="20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84" exitCode=0 Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.571609 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktzw" event={"ID":"95b42196-1572-4e3f-b807-4ef64ed3311f","Type":"ContainerDied","Data":"20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.571625 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktzw" 
event={"ID":"95b42196-1572-4e3f-b807-4ef64ed3311f","Type":"ContainerStarted","Data":"2afe4f14967b1a8a0bdb76005f3dccdc7829cc495c2b5fb169d9e55fc297a6bd"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.573532 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"61666162c7dcd58a5f6c22a8ffbd0b811968e2267c050e1f31a874d30f8203d0"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.573570 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"371a9160e1b7ee3f7b95de338422c0cb6f6bab357d5848c5b4cda4198f840bbd"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.574704 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" event={"ID":"c5615f9d-052a-4910-8050-d39d2d9dde06","Type":"ContainerStarted","Data":"764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.574726 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" event={"ID":"c5615f9d-052a-4910-8050-d39d2d9dde06","Type":"ContainerStarted","Data":"a427a7efb09d8c84ec4889f6be235894b93b36fe41394281857409e960e9ece5"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.575051 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.583044 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-npj9f" 
event={"ID":"4a70f5c2-aba5-46bb-a96b-da503d30e66e","Type":"ContainerStarted","Data":"b06e5bb8fd8454ef87ae87feafa4c919391c926d43ea6850f245e59b176d9515"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.585888 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" event={"ID":"983972ee-4dc5-4a52-9087-d69d4362b33d","Type":"ContainerStarted","Data":"435a9e4af327ad4f13b8f52357e4af2165758f8c91ccec01cd70165e692ee672"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.585931 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.585942 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" event={"ID":"983972ee-4dc5-4a52-9087-d69d4362b33d","Type":"ContainerStarted","Data":"5b2e2295d347096af2be3e4789f6ba678118a1dfe8d2e1a7a2de336528385972"} Dec 01 10:33:29 crc kubenswrapper[4761]: W1201 10:33:29.590340 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba40ceb_381a_41e3_8d11_ed171d07ee74.slice/crio-492bd18f07d1464448de0c57f56b3b5c8e0fea7676ae9cc3be1a370ec35e3b50 WatchSource:0}: Error finding container 492bd18f07d1464448de0c57f56b3b5c8e0fea7676ae9cc3be1a370ec35e3b50: Status 404 returned error can't find the container with id 492bd18f07d1464448de0c57f56b3b5c8e0fea7676ae9cc3be1a370ec35e3b50 Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.593626 4761 generic.go:334] "Generic (PLEG): container finished" podID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerID="cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce" exitCode=0 Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.593685 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55clp" 
event={"ID":"6500351a-78de-4cb9-bc74-12a450bbc76e","Type":"ContainerDied","Data":"cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.593708 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55clp" event={"ID":"6500351a-78de-4cb9-bc74-12a450bbc76e","Type":"ContainerStarted","Data":"00c3d5d746c5cff512d6ef28384c5e8380e7ad7e5c1e68f896d756a425a50ec8"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.593850 4761 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vpsf2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.593884 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" podUID="983972ee-4dc5-4a52-9087-d69d4362b33d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.598325 4761 generic.go:334] "Generic (PLEG): container finished" podID="0e51c452-5010-4af5-bb69-941565926337" containerID="4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f" exitCode=0 Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.598761 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvkf2" event={"ID":"0e51c452-5010-4af5-bb69-941565926337","Type":"ContainerDied","Data":"4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.601504 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f65a1ded99a5f228dff41cbbd0c4f896ed27142915d21d1052c4a35d2a02c804"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.601528 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1b19a77ef614e133823d0dde3aa7f994ba6a202c33c6c7c7227a74c12ecd2b91"} Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.615120 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" podStartSLOduration=3.61510474 podStartE2EDuration="3.61510474s" podCreationTimestamp="2025-12-01 10:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:29.614482962 +0000 UTC m=+148.918241596" watchObservedRunningTime="2025-12-01 10:33:29.61510474 +0000 UTC m=+148.918863354" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.660294 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" podStartSLOduration=128.660274521 podStartE2EDuration="2m8.660274521s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:29.639905377 +0000 UTC m=+148.943664001" watchObservedRunningTime="2025-12-01 10:33:29.660274521 +0000 UTC m=+148.964033145" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.686390 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-npj9f" podStartSLOduration=12.686368076 podStartE2EDuration="12.686368076s" podCreationTimestamp="2025-12-01 10:33:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:29.663834827 +0000 UTC m=+148.967593441" watchObservedRunningTime="2025-12-01 10:33:29.686368076 +0000 UTC m=+148.990126700" Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.776710 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:29 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:29 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:29 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:29 crc kubenswrapper[4761]: I1201 10:33:29.776782 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.229489 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.229878 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.235630 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.347480 4761 patch_prober.go:28] interesting pod/downloads-7954f5f757-fqctr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 01 
10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.347537 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fqctr" podUID="7713e7f9-1a0c-448b-9814-c143fdd040ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.347538 4761 patch_prober.go:28] interesting pod/downloads-7954f5f757-fqctr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.347631 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fqctr" podUID="7713e7f9-1a0c-448b-9814-c143fdd040ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.347977 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.348017 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.349731 4761 patch_prober.go:28] interesting pod/console-f9d7485db-bz89h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.349769 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bz89h" podUID="0de6067f-4bc2-4265-bb7f-e595f6060033" containerName="console" 
probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.435436 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.436069 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.443271 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.443668 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.446885 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.485133 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.485184 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.588054 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.588124 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.588270 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.624235 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.650920 4761 generic.go:334] "Generic (PLEG): container finished" podID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerID="bac0e110bd962c1c9c78530027d0b0c307456868e8728559ea9fd8c02a00a78f" exitCode=0 Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.652138 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wblg" 
event={"ID":"6ba40ceb-381a-41e3-8d11-ed171d07ee74","Type":"ContainerDied","Data":"bac0e110bd962c1c9c78530027d0b0c307456868e8728559ea9fd8c02a00a78f"} Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.652167 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wblg" event={"ID":"6ba40ceb-381a-41e3-8d11-ed171d07ee74","Type":"ContainerStarted","Data":"492bd18f07d1464448de0c57f56b3b5c8e0fea7676ae9cc3be1a370ec35e3b50"} Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.658213 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.659034 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tfh9j" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.762577 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.774586 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.784592 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:30 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:30 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:30 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.784635 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:30 crc kubenswrapper[4761]: I1201 10:33:30.847825 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:33:31 crc kubenswrapper[4761]: I1201 10:33:31.401405 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:33:31 crc kubenswrapper[4761]: W1201 10:33:31.427132 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcbe15d94_ad40_45d1_938a_c5f67f74ce8e.slice/crio-14df7694a041135f691af00d3df5a70a51a6598804a544948c0594ba1c11d7ea WatchSource:0}: Error finding container 14df7694a041135f691af00d3df5a70a51a6598804a544948c0594ba1c11d7ea: Status 404 returned error can't find the container with id 14df7694a041135f691af00d3df5a70a51a6598804a544948c0594ba1c11d7ea Dec 01 10:33:31 crc kubenswrapper[4761]: I1201 10:33:31.705117 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cbe15d94-ad40-45d1-938a-c5f67f74ce8e","Type":"ContainerStarted","Data":"14df7694a041135f691af00d3df5a70a51a6598804a544948c0594ba1c11d7ea"} Dec 01 10:33:31 crc kubenswrapper[4761]: I1201 10:33:31.789521 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:31 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:31 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:31 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:31 crc kubenswrapper[4761]: I1201 10:33:31.789604 4761 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:32 crc kubenswrapper[4761]: I1201 10:33:32.716317 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cbe15d94-ad40-45d1-938a-c5f67f74ce8e","Type":"ContainerStarted","Data":"d2dbb52de9313805b54f66b5c1912163b0f4685ac7ce83321b9288af30da0212"} Dec 01 10:33:32 crc kubenswrapper[4761]: I1201 10:33:32.735355 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.735334837 podStartE2EDuration="2.735334837s" podCreationTimestamp="2025-12-01 10:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:32.732114462 +0000 UTC m=+152.035873086" watchObservedRunningTime="2025-12-01 10:33:32.735334837 +0000 UTC m=+152.039093461" Dec 01 10:33:32 crc kubenswrapper[4761]: I1201 10:33:32.775378 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:32 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:32 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:32 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:32 crc kubenswrapper[4761]: I1201 10:33:32.775426 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 
10:33:33.579689 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.581307 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.583047 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.583504 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.585251 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.704266 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.704354 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.738517 4761 generic.go:334] "Generic (PLEG): container finished" podID="cbe15d94-ad40-45d1-938a-c5f67f74ce8e" containerID="d2dbb52de9313805b54f66b5c1912163b0f4685ac7ce83321b9288af30da0212" exitCode=0 Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.738942 4761 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cbe15d94-ad40-45d1-938a-c5f67f74ce8e","Type":"ContainerDied","Data":"d2dbb52de9313805b54f66b5c1912163b0f4685ac7ce83321b9288af30da0212"} Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.776497 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:33 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:33 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:33 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.776566 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.805321 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.805392 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.805456 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.839211 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.849933 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.849983 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:33:33 crc kubenswrapper[4761]: I1201 10:33:33.909945 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:34 crc kubenswrapper[4761]: I1201 10:33:34.515614 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:33:34 crc kubenswrapper[4761]: I1201 10:33:34.773729 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0","Type":"ContainerStarted","Data":"60f75e2f5c8fc40a4d04a5a287df166b6a70291ec81d64e3f909a58cd8a02106"} Dec 01 10:33:34 crc kubenswrapper[4761]: I1201 10:33:34.775886 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:34 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:34 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:34 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:34 crc kubenswrapper[4761]: I1201 10:33:34.775940 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.214282 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.353726 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kube-api-access\") pod \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\" (UID: \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\") " Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.353916 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kubelet-dir\") pod \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\" (UID: \"cbe15d94-ad40-45d1-938a-c5f67f74ce8e\") " Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.354792 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cbe15d94-ad40-45d1-938a-c5f67f74ce8e" (UID: "cbe15d94-ad40-45d1-938a-c5f67f74ce8e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.364527 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cbe15d94-ad40-45d1-938a-c5f67f74ce8e" (UID: "cbe15d94-ad40-45d1-938a-c5f67f74ce8e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.455761 4761 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.455788 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbe15d94-ad40-45d1-938a-c5f67f74ce8e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.777242 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:35 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:35 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:35 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.777291 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.790130 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0","Type":"ContainerStarted","Data":"28a4daa65aea8ad6c58605b7bb37c731b4415b5523321c876cd011f222ea2355"} Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.794325 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"cbe15d94-ad40-45d1-938a-c5f67f74ce8e","Type":"ContainerDied","Data":"14df7694a041135f691af00d3df5a70a51a6598804a544948c0594ba1c11d7ea"} Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.794364 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14df7694a041135f691af00d3df5a70a51a6598804a544948c0594ba1c11d7ea" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.794374 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.815076 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.815057923 podStartE2EDuration="2.815057923s" podCreationTimestamp="2025-12-01 10:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:35.801437449 +0000 UTC m=+155.105196073" watchObservedRunningTime="2025-12-01 10:33:35.815057923 +0000 UTC m=+155.118816547" Dec 01 10:33:35 crc kubenswrapper[4761]: I1201 10:33:35.989651 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xf5wg" Dec 01 10:33:36 crc kubenswrapper[4761]: I1201 10:33:36.786063 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:36 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:36 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:36 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:36 crc kubenswrapper[4761]: I1201 10:33:36.787366 4761 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:36 crc kubenswrapper[4761]: I1201 10:33:36.809290 4761 generic.go:334] "Generic (PLEG): container finished" podID="28dc4b8d-a15b-4ec3-9dc2-3def59debaa0" containerID="28a4daa65aea8ad6c58605b7bb37c731b4415b5523321c876cd011f222ea2355" exitCode=0 Dec 01 10:33:36 crc kubenswrapper[4761]: I1201 10:33:36.809334 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0","Type":"ContainerDied","Data":"28a4daa65aea8ad6c58605b7bb37c731b4415b5523321c876cd011f222ea2355"} Dec 01 10:33:37 crc kubenswrapper[4761]: I1201 10:33:37.775677 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:37 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:37 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:37 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:37 crc kubenswrapper[4761]: I1201 10:33:37.776013 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:37 crc kubenswrapper[4761]: I1201 10:33:37.820223 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-xvpkl_f1827035-d23f-4436-96ee-f363b9ea9022/cluster-samples-operator/0.log" Dec 01 10:33:37 crc kubenswrapper[4761]: I1201 10:33:37.820282 4761 generic.go:334] "Generic (PLEG): container finished" 
podID="f1827035-d23f-4436-96ee-f363b9ea9022" containerID="26c731770a75e3a22a6ab12f2269a34e262b9d3bf0b0b4a926e0667bb264ee29" exitCode=2 Dec 01 10:33:37 crc kubenswrapper[4761]: I1201 10:33:37.820386 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" event={"ID":"f1827035-d23f-4436-96ee-f363b9ea9022","Type":"ContainerDied","Data":"26c731770a75e3a22a6ab12f2269a34e262b9d3bf0b0b4a926e0667bb264ee29"} Dec 01 10:33:37 crc kubenswrapper[4761]: I1201 10:33:37.821114 4761 scope.go:117] "RemoveContainer" containerID="26c731770a75e3a22a6ab12f2269a34e262b9d3bf0b0b4a926e0667bb264ee29" Dec 01 10:33:38 crc kubenswrapper[4761]: I1201 10:33:38.781337 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:38 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:38 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:38 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:38 crc kubenswrapper[4761]: I1201 10:33:38.781701 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:39 crc kubenswrapper[4761]: I1201 10:33:39.777242 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:39 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:39 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:39 crc kubenswrapper[4761]: healthz 
check failed Dec 01 10:33:39 crc kubenswrapper[4761]: I1201 10:33:39.777308 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:40 crc kubenswrapper[4761]: I1201 10:33:40.350076 4761 patch_prober.go:28] interesting pod/console-f9d7485db-bz89h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 01 10:33:40 crc kubenswrapper[4761]: I1201 10:33:40.350124 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bz89h" podUID="0de6067f-4bc2-4265-bb7f-e595f6060033" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 01 10:33:40 crc kubenswrapper[4761]: I1201 10:33:40.366076 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-fqctr" Dec 01 10:33:40 crc kubenswrapper[4761]: I1201 10:33:40.783620 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:40 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:40 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:40 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:40 crc kubenswrapper[4761]: I1201 10:33:40.783686 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Dec 01 10:33:41 crc kubenswrapper[4761]: I1201 10:33:41.775821 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:41 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:41 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:41 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:41 crc kubenswrapper[4761]: I1201 10:33:41.776127 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:42 crc kubenswrapper[4761]: I1201 10:33:42.777142 4761 patch_prober.go:28] interesting pod/router-default-5444994796-lpmsm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:42 crc kubenswrapper[4761]: [-]has-synced failed: reason withheld Dec 01 10:33:42 crc kubenswrapper[4761]: [+]process-running ok Dec 01 10:33:42 crc kubenswrapper[4761]: healthz check failed Dec 01 10:33:42 crc kubenswrapper[4761]: I1201 10:33:42.777215 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpmsm" podUID="e423ab17-2ba9-4b3a-8ff8-17c0addd9077" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:43 crc kubenswrapper[4761]: I1201 10:33:43.534851 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: 
\"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:43 crc kubenswrapper[4761]: I1201 10:33:43.548798 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65d0c868-c268-4723-9323-6937c06b4ea9-metrics-certs\") pod \"network-metrics-daemon-86rp7\" (UID: \"65d0c868-c268-4723-9323-6937c06b4ea9\") " pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:43 crc kubenswrapper[4761]: I1201 10:33:43.778268 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:43 crc kubenswrapper[4761]: I1201 10:33:43.782474 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lpmsm" Dec 01 10:33:43 crc kubenswrapper[4761]: I1201 10:33:43.783959 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-86rp7" Dec 01 10:33:47 crc kubenswrapper[4761]: I1201 10:33:47.950490 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:33:50 crc kubenswrapper[4761]: I1201 10:33:50.389197 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:50 crc kubenswrapper[4761]: I1201 10:33:50.393081 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-bz89h" Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.567487 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.717335 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kubelet-dir\") pod \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\" (UID: \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\") " Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.717466 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kube-api-access\") pod \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\" (UID: \"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0\") " Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.717458 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "28dc4b8d-a15b-4ec3-9dc2-3def59debaa0" (UID: "28dc4b8d-a15b-4ec3-9dc2-3def59debaa0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.717773 4761 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.730796 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "28dc4b8d-a15b-4ec3-9dc2-3def59debaa0" (UID: "28dc4b8d-a15b-4ec3-9dc2-3def59debaa0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.819130 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28dc4b8d-a15b-4ec3-9dc2-3def59debaa0-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.921878 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28dc4b8d-a15b-4ec3-9dc2-3def59debaa0","Type":"ContainerDied","Data":"60f75e2f5c8fc40a4d04a5a287df166b6a70291ec81d64e3f909a58cd8a02106"} Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.921914 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:53 crc kubenswrapper[4761]: I1201 10:33:53.921918 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60f75e2f5c8fc40a4d04a5a287df166b6a70291ec81d64e3f909a58cd8a02106" Dec 01 10:33:58 crc kubenswrapper[4761]: E1201 10:33:58.870510 4761 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 10:33:58 crc kubenswrapper[4761]: E1201 10:33:58.871256 4761 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hc9jr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2wblg_openshift-marketplace(6ba40ceb-381a-41e3-8d11-ed171d07ee74): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:33:58 crc kubenswrapper[4761]: E1201 10:33:58.873694 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2wblg" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" Dec 01 10:33:58 crc 
kubenswrapper[4761]: E1201 10:33:58.878332 4761 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 10:33:58 crc kubenswrapper[4761]: E1201 10:33:58.878460 4761 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldm8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-55clp_openshift-marketplace(6500351a-78de-4cb9-bc74-12a450bbc76e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:33:58 crc kubenswrapper[4761]: E1201 10:33:58.879649 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-55clp" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" Dec 01 10:33:59 crc kubenswrapper[4761]: E1201 10:33:59.716444 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-55clp" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" Dec 01 10:33:59 crc kubenswrapper[4761]: E1201 10:33:59.716505 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2wblg" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" Dec 01 10:33:59 crc kubenswrapper[4761]: E1201 10:33:59.797321 4761 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 10:33:59 crc kubenswrapper[4761]: E1201 10:33:59.797472 4761 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qh6cg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nktzw_openshift-marketplace(95b42196-1572-4e3f-b807-4ef64ed3311f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:33:59 crc kubenswrapper[4761]: E1201 10:33:59.798690 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nktzw" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" Dec 01 10:34:01 crc kubenswrapper[4761]: E1201 10:34:01.038358 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nktzw" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" Dec 01 10:34:01 crc kubenswrapper[4761]: I1201 10:34:01.124612 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-852hr" Dec 01 10:34:02 crc kubenswrapper[4761]: E1201 10:34:02.397821 4761 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 10:34:02 crc kubenswrapper[4761]: E1201 10:34:02.398418 4761 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42248,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d99sk_openshift-marketplace(1e69dab2-4c11-4352-95c8-92499a4c5a75): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:34:02 crc kubenswrapper[4761]: E1201 10:34:02.399859 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d99sk" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" Dec 01 10:34:02 crc 
kubenswrapper[4761]: E1201 10:34:02.419017 4761 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 10:34:02 crc kubenswrapper[4761]: E1201 10:34:02.419130 4761 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8x8bg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-5c886_openshift-marketplace(5821e59d-de93-43fd-822d-83128ce780de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:34:02 crc kubenswrapper[4761]: E1201 10:34:02.420277 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5c886" podUID="5821e59d-de93-43fd-822d-83128ce780de" Dec 01 10:34:02 crc kubenswrapper[4761]: I1201 10:34:02.811933 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-86rp7"] Dec 01 10:34:02 crc kubenswrapper[4761]: E1201 10:34:02.825043 4761 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e51c452_5010_4af5_bb69_941565926337.slice/crio-conmon-8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:34:02 crc kubenswrapper[4761]: I1201 10:34:02.997134 4761 generic.go:334] "Generic (PLEG): container finished" podID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerID="0c15fff14c4d45ab4037666824245d1dc5a49cc9c4bdd229320c571aae6fb389" exitCode=0 Dec 01 10:34:02 crc kubenswrapper[4761]: I1201 10:34:02.997218 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nztd7" event={"ID":"3fe88ace-f487-4b05-a9de-d5bdd2945c75","Type":"ContainerDied","Data":"0c15fff14c4d45ab4037666824245d1dc5a49cc9c4bdd229320c571aae6fb389"} Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.001990 4761 generic.go:334] "Generic (PLEG): container finished" podID="0e51c452-5010-4af5-bb69-941565926337" 
containerID="8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6" exitCode=0 Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.002031 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvkf2" event={"ID":"0e51c452-5010-4af5-bb69-941565926337","Type":"ContainerDied","Data":"8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6"} Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.003717 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-86rp7" event={"ID":"65d0c868-c268-4723-9323-6937c06b4ea9","Type":"ContainerStarted","Data":"62d83335901b8028fc9c69c5511fdae6e2a4c10f6a6de3b5d3acd97fad937219"} Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.008711 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-xvpkl_f1827035-d23f-4436-96ee-f363b9ea9022/cluster-samples-operator/0.log" Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.008771 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvpkl" event={"ID":"f1827035-d23f-4436-96ee-f363b9ea9022","Type":"ContainerStarted","Data":"a65cdc9423ede3fb38624da05ccdd1cd62a5388bbc737ab3fe7fafbeba3be11b"} Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.012286 4761 generic.go:334] "Generic (PLEG): container finished" podID="a570f753-345e-40b8-a088-2d28ecf41896" containerID="4dabd376668d1e1e57ec339e216278ab32bcc7f0de5ac4658646aa8eee0119ab" exitCode=0 Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.012346 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh6dn" event={"ID":"a570f753-345e-40b8-a088-2d28ecf41896","Type":"ContainerDied","Data":"4dabd376668d1e1e57ec339e216278ab32bcc7f0de5ac4658646aa8eee0119ab"} Dec 01 10:34:03 crc kubenswrapper[4761]: E1201 10:34:03.013717 
4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5c886" podUID="5821e59d-de93-43fd-822d-83128ce780de" Dec 01 10:34:03 crc kubenswrapper[4761]: E1201 10:34:03.013755 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d99sk" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.850709 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:34:03 crc kubenswrapper[4761]: I1201 10:34:03.851024 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.020240 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-86rp7" event={"ID":"65d0c868-c268-4723-9323-6937c06b4ea9","Type":"ContainerStarted","Data":"4d7b1256871aadadfe5d0c955dab535d54076ae28418e16d552070adf6202a0a"} Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.020625 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-86rp7" 
event={"ID":"65d0c868-c268-4723-9323-6937c06b4ea9","Type":"ContainerStarted","Data":"6a8401f3c9e460628010465cd8ecbf9bab74d8ae38b6868493f0e6fc446861dd"} Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.022561 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh6dn" event={"ID":"a570f753-345e-40b8-a088-2d28ecf41896","Type":"ContainerStarted","Data":"1e6aa2141cded851c99e18b84f5ee8f00bcf7bd71f70ad4e29b29ecfa8d42b84"} Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.024958 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nztd7" event={"ID":"3fe88ace-f487-4b05-a9de-d5bdd2945c75","Type":"ContainerStarted","Data":"bc6e4a41ee9fa99f3cb30dbae761dbe61705bc0d840abe2adc038c1ed1fb799c"} Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.026938 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvkf2" event={"ID":"0e51c452-5010-4af5-bb69-941565926337","Type":"ContainerStarted","Data":"01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984"} Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.039392 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-86rp7" podStartSLOduration=163.039375442 podStartE2EDuration="2m43.039375442s" podCreationTimestamp="2025-12-01 10:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:04.035622551 +0000 UTC m=+183.339381175" watchObservedRunningTime="2025-12-01 10:34:04.039375442 +0000 UTC m=+183.343134056" Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.085344 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fvkf2" podStartSLOduration=3.237939447 podStartE2EDuration="37.085323936s" podCreationTimestamp="2025-12-01 
10:33:27 +0000 UTC" firstStartedPulling="2025-12-01 10:33:29.613771161 +0000 UTC m=+148.917529785" lastFinishedPulling="2025-12-01 10:34:03.46115564 +0000 UTC m=+182.764914274" observedRunningTime="2025-12-01 10:34:04.068392703 +0000 UTC m=+183.372151327" watchObservedRunningTime="2025-12-01 10:34:04.085323936 +0000 UTC m=+183.389082560" Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.085630 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qh6dn" podStartSLOduration=3.995019897 podStartE2EDuration="39.085624955s" podCreationTimestamp="2025-12-01 10:33:25 +0000 UTC" firstStartedPulling="2025-12-01 10:33:28.534097146 +0000 UTC m=+147.837855770" lastFinishedPulling="2025-12-01 10:34:03.624702194 +0000 UTC m=+182.928460828" observedRunningTime="2025-12-01 10:34:04.085082709 +0000 UTC m=+183.388841333" watchObservedRunningTime="2025-12-01 10:34:04.085624955 +0000 UTC m=+183.389383579" Dec 01 10:34:04 crc kubenswrapper[4761]: I1201 10:34:04.104002 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nztd7" podStartSLOduration=2.923335209 podStartE2EDuration="39.10398385s" podCreationTimestamp="2025-12-01 10:33:25 +0000 UTC" firstStartedPulling="2025-12-01 10:33:27.385479305 +0000 UTC m=+146.689237929" lastFinishedPulling="2025-12-01 10:34:03.566127946 +0000 UTC m=+182.869886570" observedRunningTime="2025-12-01 10:34:04.100356672 +0000 UTC m=+183.404115326" watchObservedRunningTime="2025-12-01 10:34:04.10398385 +0000 UTC m=+183.407742474" Dec 01 10:34:05 crc kubenswrapper[4761]: I1201 10:34:05.852917 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:34:05 crc kubenswrapper[4761]: I1201 10:34:05.853115 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:34:05 
crc kubenswrapper[4761]: I1201 10:34:05.917384 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.275210 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.275247 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.318079 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.370899 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 10:34:06 crc kubenswrapper[4761]: E1201 10:34:06.371121 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28dc4b8d-a15b-4ec3-9dc2-3def59debaa0" containerName="pruner" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.371137 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="28dc4b8d-a15b-4ec3-9dc2-3def59debaa0" containerName="pruner" Dec 01 10:34:06 crc kubenswrapper[4761]: E1201 10:34:06.371155 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe15d94-ad40-45d1-938a-c5f67f74ce8e" containerName="pruner" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.371162 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe15d94-ad40-45d1-938a-c5f67f74ce8e" containerName="pruner" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.371249 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe15d94-ad40-45d1-938a-c5f67f74ce8e" containerName="pruner" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.371259 4761 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="28dc4b8d-a15b-4ec3-9dc2-3def59debaa0" containerName="pruner" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.371599 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.374278 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.374896 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.396850 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.498879 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39ef76a3-50b2-4062-9602-dcb4768bd13d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"39ef76a3-50b2-4062-9602-dcb4768bd13d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.498987 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39ef76a3-50b2-4062-9602-dcb4768bd13d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"39ef76a3-50b2-4062-9602-dcb4768bd13d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.600218 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39ef76a3-50b2-4062-9602-dcb4768bd13d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"39ef76a3-50b2-4062-9602-dcb4768bd13d\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.600312 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39ef76a3-50b2-4062-9602-dcb4768bd13d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"39ef76a3-50b2-4062-9602-dcb4768bd13d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.600409 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39ef76a3-50b2-4062-9602-dcb4768bd13d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"39ef76a3-50b2-4062-9602-dcb4768bd13d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.623630 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39ef76a3-50b2-4062-9602-dcb4768bd13d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"39ef76a3-50b2-4062-9602-dcb4768bd13d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.704113 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:06 crc kubenswrapper[4761]: I1201 10:34:06.912329 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 10:34:07 crc kubenswrapper[4761]: I1201 10:34:07.042390 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"39ef76a3-50b2-4062-9602-dcb4768bd13d","Type":"ContainerStarted","Data":"65e4a2cabef29c0c6e9ec81997cb0e964897fb5d86e05372d03cfeb8f3c01f85"} Dec 01 10:34:07 crc kubenswrapper[4761]: I1201 10:34:07.878680 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:34:07 crc kubenswrapper[4761]: I1201 10:34:07.878749 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:34:07 crc kubenswrapper[4761]: I1201 10:34:07.929303 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:34:08 crc kubenswrapper[4761]: I1201 10:34:08.051015 4761 generic.go:334] "Generic (PLEG): container finished" podID="39ef76a3-50b2-4062-9602-dcb4768bd13d" containerID="830809df4c33e53fc2828becbcfccad830da8ca19338ff7562399aefca7be3d1" exitCode=0 Dec 01 10:34:08 crc kubenswrapper[4761]: I1201 10:34:08.052510 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"39ef76a3-50b2-4062-9602-dcb4768bd13d","Type":"ContainerDied","Data":"830809df4c33e53fc2828becbcfccad830da8ca19338ff7562399aefca7be3d1"} Dec 01 10:34:08 crc kubenswrapper[4761]: I1201 10:34:08.107258 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:34:08 crc kubenswrapper[4761]: I1201 10:34:08.344097 4761 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:34:08 crc kubenswrapper[4761]: I1201 10:34:08.663876 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82k6m"] Dec 01 10:34:09 crc kubenswrapper[4761]: I1201 10:34:09.274325 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:09 crc kubenswrapper[4761]: I1201 10:34:09.344943 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39ef76a3-50b2-4062-9602-dcb4768bd13d-kube-api-access\") pod \"39ef76a3-50b2-4062-9602-dcb4768bd13d\" (UID: \"39ef76a3-50b2-4062-9602-dcb4768bd13d\") " Dec 01 10:34:09 crc kubenswrapper[4761]: I1201 10:34:09.345263 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39ef76a3-50b2-4062-9602-dcb4768bd13d-kubelet-dir\") pod \"39ef76a3-50b2-4062-9602-dcb4768bd13d\" (UID: \"39ef76a3-50b2-4062-9602-dcb4768bd13d\") " Dec 01 10:34:09 crc kubenswrapper[4761]: I1201 10:34:09.345372 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39ef76a3-50b2-4062-9602-dcb4768bd13d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "39ef76a3-50b2-4062-9602-dcb4768bd13d" (UID: "39ef76a3-50b2-4062-9602-dcb4768bd13d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:09 crc kubenswrapper[4761]: I1201 10:34:09.345603 4761 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39ef76a3-50b2-4062-9602-dcb4768bd13d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:09 crc kubenswrapper[4761]: I1201 10:34:09.349875 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ef76a3-50b2-4062-9602-dcb4768bd13d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "39ef76a3-50b2-4062-9602-dcb4768bd13d" (UID: "39ef76a3-50b2-4062-9602-dcb4768bd13d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:09 crc kubenswrapper[4761]: I1201 10:34:09.447197 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39ef76a3-50b2-4062-9602-dcb4768bd13d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:10 crc kubenswrapper[4761]: I1201 10:34:10.076886 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"39ef76a3-50b2-4062-9602-dcb4768bd13d","Type":"ContainerDied","Data":"65e4a2cabef29c0c6e9ec81997cb0e964897fb5d86e05372d03cfeb8f3c01f85"} Dec 01 10:34:10 crc kubenswrapper[4761]: I1201 10:34:10.076939 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65e4a2cabef29c0c6e9ec81997cb0e964897fb5d86e05372d03cfeb8f3c01f85" Dec 01 10:34:10 crc kubenswrapper[4761]: I1201 10:34:10.077019 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.972117 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:34:12 crc kubenswrapper[4761]: E1201 10:34:12.972951 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ef76a3-50b2-4062-9602-dcb4768bd13d" containerName="pruner" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.972965 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ef76a3-50b2-4062-9602-dcb4768bd13d" containerName="pruner" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.973078 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ef76a3-50b2-4062-9602-dcb4768bd13d" containerName="pruner" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.973430 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.981087 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.983355 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.983605 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.984983 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57880e23-a871-459f-8c11-2d59e61e2eaf-kube-api-access\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.985041 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-kubelet-dir\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:12 crc kubenswrapper[4761]: I1201 10:34:12.985080 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-var-lock\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.086298 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-kubelet-dir\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.086413 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-var-lock\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.086427 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-kubelet-dir\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.086475 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-var-lock\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.086502 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57880e23-a871-459f-8c11-2d59e61e2eaf-kube-api-access\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.096501 4761 generic.go:334] "Generic (PLEG): container finished" podID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerID="0212e394b3dd72631bc22455f87dabe8320ab9b69595cb2f72c4fbcce4b44ae3" exitCode=0 Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.096565 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wblg" event={"ID":"6ba40ceb-381a-41e3-8d11-ed171d07ee74","Type":"ContainerDied","Data":"0212e394b3dd72631bc22455f87dabe8320ab9b69595cb2f72c4fbcce4b44ae3"} Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.108329 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57880e23-a871-459f-8c11-2d59e61e2eaf-kube-api-access\") pod \"installer-9-crc\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.303647 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:13 crc kubenswrapper[4761]: I1201 10:34:13.524663 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:34:13 crc kubenswrapper[4761]: W1201 10:34:13.534837 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod57880e23_a871_459f_8c11_2d59e61e2eaf.slice/crio-5efc0e400b0dcc59e524d11fe1054e6ec524bbb29eedef43ab7818265f4f02d9 WatchSource:0}: Error finding container 5efc0e400b0dcc59e524d11fe1054e6ec524bbb29eedef43ab7818265f4f02d9: Status 404 returned error can't find the container with id 5efc0e400b0dcc59e524d11fe1054e6ec524bbb29eedef43ab7818265f4f02d9 Dec 01 10:34:14 crc kubenswrapper[4761]: I1201 10:34:14.104996 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"57880e23-a871-459f-8c11-2d59e61e2eaf","Type":"ContainerStarted","Data":"df4ce2c3ffadc7b6df474b8eac51389ad05aeb25f7eb9b5d2e6161c2a93319ff"} Dec 01 10:34:14 crc kubenswrapper[4761]: I1201 10:34:14.105330 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"57880e23-a871-459f-8c11-2d59e61e2eaf","Type":"ContainerStarted","Data":"5efc0e400b0dcc59e524d11fe1054e6ec524bbb29eedef43ab7818265f4f02d9"} Dec 01 10:34:14 crc kubenswrapper[4761]: I1201 10:34:14.108141 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wblg" event={"ID":"6ba40ceb-381a-41e3-8d11-ed171d07ee74","Type":"ContainerStarted","Data":"e40b73fa6fa1ac4b4bc6c9fe0bccd417823834ada5a31d09376491b8f61b7ded"} Dec 01 10:34:14 crc kubenswrapper[4761]: I1201 10:34:14.125209 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2wblg" podStartSLOduration=3.225565558 podStartE2EDuration="46.125189596s" podCreationTimestamp="2025-12-01 10:33:28 +0000 
UTC" firstStartedPulling="2025-12-01 10:33:30.654519049 +0000 UTC m=+149.958277673" lastFinishedPulling="2025-12-01 10:34:13.554143087 +0000 UTC m=+192.857901711" observedRunningTime="2025-12-01 10:34:14.122276029 +0000 UTC m=+193.426034673" watchObservedRunningTime="2025-12-01 10:34:14.125189596 +0000 UTC m=+193.428948220" Dec 01 10:34:15 crc kubenswrapper[4761]: I1201 10:34:15.135865 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.1358467819999998 podStartE2EDuration="3.135846782s" podCreationTimestamp="2025-12-01 10:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:15.133555994 +0000 UTC m=+194.437314618" watchObservedRunningTime="2025-12-01 10:34:15.135846782 +0000 UTC m=+194.439605406" Dec 01 10:34:15 crc kubenswrapper[4761]: I1201 10:34:15.901298 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:34:16 crc kubenswrapper[4761]: I1201 10:34:16.310425 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:34:18 crc kubenswrapper[4761]: I1201 10:34:18.527002 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qh6dn"] Dec 01 10:34:18 crc kubenswrapper[4761]: I1201 10:34:18.527388 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qh6dn" podUID="a570f753-345e-40b8-a088-2d28ecf41896" containerName="registry-server" containerID="cri-o://1e6aa2141cded851c99e18b84f5ee8f00bcf7bd71f70ad4e29b29ecfa8d42b84" gracePeriod=2 Dec 01 10:34:19 crc kubenswrapper[4761]: I1201 10:34:19.216508 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:34:19 crc kubenswrapper[4761]: I1201 10:34:19.216901 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:34:19 crc kubenswrapper[4761]: I1201 10:34:19.262519 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:34:20 crc kubenswrapper[4761]: I1201 10:34:20.164709 4761 generic.go:334] "Generic (PLEG): container finished" podID="a570f753-345e-40b8-a088-2d28ecf41896" containerID="1e6aa2141cded851c99e18b84f5ee8f00bcf7bd71f70ad4e29b29ecfa8d42b84" exitCode=0 Dec 01 10:34:20 crc kubenswrapper[4761]: I1201 10:34:20.164810 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh6dn" event={"ID":"a570f753-345e-40b8-a088-2d28ecf41896","Type":"ContainerDied","Data":"1e6aa2141cded851c99e18b84f5ee8f00bcf7bd71f70ad4e29b29ecfa8d42b84"} Dec 01 10:34:20 crc kubenswrapper[4761]: I1201 10:34:20.205918 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.606542 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.704695 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-catalog-content\") pod \"a570f753-345e-40b8-a088-2d28ecf41896\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.704772 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2jl5\" (UniqueName: \"kubernetes.io/projected/a570f753-345e-40b8-a088-2d28ecf41896-kube-api-access-v2jl5\") pod \"a570f753-345e-40b8-a088-2d28ecf41896\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.704819 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-utilities\") pod \"a570f753-345e-40b8-a088-2d28ecf41896\" (UID: \"a570f753-345e-40b8-a088-2d28ecf41896\") " Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.706097 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-utilities" (OuterVolumeSpecName: "utilities") pod "a570f753-345e-40b8-a088-2d28ecf41896" (UID: "a570f753-345e-40b8-a088-2d28ecf41896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.711092 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a570f753-345e-40b8-a088-2d28ecf41896-kube-api-access-v2jl5" (OuterVolumeSpecName: "kube-api-access-v2jl5") pod "a570f753-345e-40b8-a088-2d28ecf41896" (UID: "a570f753-345e-40b8-a088-2d28ecf41896"). InnerVolumeSpecName "kube-api-access-v2jl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.754710 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a570f753-345e-40b8-a088-2d28ecf41896" (UID: "a570f753-345e-40b8-a088-2d28ecf41896"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.806667 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.806880 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2jl5\" (UniqueName: \"kubernetes.io/projected/a570f753-345e-40b8-a088-2d28ecf41896-kube-api-access-v2jl5\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:21 crc kubenswrapper[4761]: I1201 10:34:21.806946 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a570f753-345e-40b8-a088-2d28ecf41896-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.175698 4761 generic.go:334] "Generic (PLEG): container finished" podID="5821e59d-de93-43fd-822d-83128ce780de" containerID="aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1" exitCode=0 Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.175776 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5c886" event={"ID":"5821e59d-de93-43fd-822d-83128ce780de","Type":"ContainerDied","Data":"aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1"} Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.178873 4761 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qh6dn" Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.178931 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qh6dn" event={"ID":"a570f753-345e-40b8-a088-2d28ecf41896","Type":"ContainerDied","Data":"e00084b4e8903b70791b0911590c919d23818d52b46b4c07c64bf62ba4675816"} Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.178983 4761 scope.go:117] "RemoveContainer" containerID="1e6aa2141cded851c99e18b84f5ee8f00bcf7bd71f70ad4e29b29ecfa8d42b84" Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.189058 4761 generic.go:334] "Generic (PLEG): container finished" podID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerID="c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc" exitCode=0 Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.189117 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55clp" event={"ID":"6500351a-78de-4cb9-bc74-12a450bbc76e","Type":"ContainerDied","Data":"c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc"} Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.198333 4761 generic.go:334] "Generic (PLEG): container finished" podID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerID="0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2" exitCode=0 Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.198454 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktzw" event={"ID":"95b42196-1572-4e3f-b807-4ef64ed3311f","Type":"ContainerDied","Data":"0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2"} Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.205524 4761 generic.go:334] "Generic (PLEG): container finished" podID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerID="409b1b1f9063bd31e3060195fe09e467c62898ce54b54b239a459b30bde9e8fd" exitCode=0 Dec 
01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.205607 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d99sk" event={"ID":"1e69dab2-4c11-4352-95c8-92499a4c5a75","Type":"ContainerDied","Data":"409b1b1f9063bd31e3060195fe09e467c62898ce54b54b239a459b30bde9e8fd"} Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.218422 4761 scope.go:117] "RemoveContainer" containerID="4dabd376668d1e1e57ec339e216278ab32bcc7f0de5ac4658646aa8eee0119ab" Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.289516 4761 scope.go:117] "RemoveContainer" containerID="9181ab0230eb069f1ac6797499b08b1f03ac9ff586e2363655797276331fd723" Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.290065 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qh6dn"] Dec 01 10:34:22 crc kubenswrapper[4761]: I1201 10:34:22.294408 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qh6dn"] Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.135300 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a570f753-345e-40b8-a088-2d28ecf41896" path="/var/lib/kubelet/pods/a570f753-345e-40b8-a088-2d28ecf41896/volumes" Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.211396 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktzw" event={"ID":"95b42196-1572-4e3f-b807-4ef64ed3311f","Type":"ContainerStarted","Data":"3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399"} Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.215635 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d99sk" event={"ID":"1e69dab2-4c11-4352-95c8-92499a4c5a75","Type":"ContainerStarted","Data":"c37465957bcbbab7f933eeb3b4f186c26f6625eb3b4e196ebf0a05d05579ca70"} Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.218265 4761 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5c886" event={"ID":"5821e59d-de93-43fd-822d-83128ce780de","Type":"ContainerStarted","Data":"da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787"} Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.220788 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55clp" event={"ID":"6500351a-78de-4cb9-bc74-12a450bbc76e","Type":"ContainerStarted","Data":"c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd"} Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.238189 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nktzw" podStartSLOduration=3.107155881 podStartE2EDuration="56.238167432s" podCreationTimestamp="2025-12-01 10:33:27 +0000 UTC" firstStartedPulling="2025-12-01 10:33:29.572917538 +0000 UTC m=+148.876676162" lastFinishedPulling="2025-12-01 10:34:22.703929089 +0000 UTC m=+202.007687713" observedRunningTime="2025-12-01 10:34:23.235678156 +0000 UTC m=+202.539436800" watchObservedRunningTime="2025-12-01 10:34:23.238167432 +0000 UTC m=+202.541926056" Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.273066 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5c886" podStartSLOduration=3.029802062 podStartE2EDuration="58.273052638s" podCreationTimestamp="2025-12-01 10:33:25 +0000 UTC" firstStartedPulling="2025-12-01 10:33:27.479025762 +0000 UTC m=+146.782784386" lastFinishedPulling="2025-12-01 10:34:22.722276338 +0000 UTC m=+202.026034962" observedRunningTime="2025-12-01 10:34:23.25617316 +0000 UTC m=+202.559931774" watchObservedRunningTime="2025-12-01 10:34:23.273052638 +0000 UTC m=+202.576811262" Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.273954 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-d99sk" podStartSLOduration=2.911304216 podStartE2EDuration="58.273947029s" podCreationTimestamp="2025-12-01 10:33:25 +0000 UTC" firstStartedPulling="2025-12-01 10:33:27.451296139 +0000 UTC m=+146.755054763" lastFinishedPulling="2025-12-01 10:34:22.813938962 +0000 UTC m=+202.117697576" observedRunningTime="2025-12-01 10:34:23.270935114 +0000 UTC m=+202.574693738" watchObservedRunningTime="2025-12-01 10:34:23.273947029 +0000 UTC m=+202.577705653" Dec 01 10:34:23 crc kubenswrapper[4761]: I1201 10:34:23.286511 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-55clp" podStartSLOduration=2.144009062 podStartE2EDuration="55.286496436s" podCreationTimestamp="2025-12-01 10:33:28 +0000 UTC" firstStartedPulling="2025-12-01 10:33:29.594913831 +0000 UTC m=+148.898672455" lastFinishedPulling="2025-12-01 10:34:22.737401145 +0000 UTC m=+202.041159829" observedRunningTime="2025-12-01 10:34:23.285211931 +0000 UTC m=+202.588970555" watchObservedRunningTime="2025-12-01 10:34:23.286496436 +0000 UTC m=+202.590255060" Dec 01 10:34:24 crc kubenswrapper[4761]: I1201 10:34:24.051585 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpsf2"] Dec 01 10:34:24 crc kubenswrapper[4761]: I1201 10:34:24.051792 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" podUID="983972ee-4dc5-4a52-9087-d69d4362b33d" containerName="controller-manager" containerID="cri-o://435a9e4af327ad4f13b8f52357e4af2165758f8c91ccec01cd70165e692ee672" gracePeriod=30 Dec 01 10:34:24 crc kubenswrapper[4761]: I1201 10:34:24.159428 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl"] Dec 01 10:34:24 crc kubenswrapper[4761]: I1201 10:34:24.172983 4761 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" podUID="33f5c4fc-08a4-4683-ab53-e20612b27d02" containerName="route-controller-manager" containerID="cri-o://af537995f5c7635f7dc3bb4959993d721813c4ebd32be939fb412f5f958aa966" gracePeriod=30 Dec 01 10:34:24 crc kubenswrapper[4761]: I1201 10:34:24.324364 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2wblg"] Dec 01 10:34:24 crc kubenswrapper[4761]: I1201 10:34:24.324632 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2wblg" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerName="registry-server" containerID="cri-o://e40b73fa6fa1ac4b4bc6c9fe0bccd417823834ada5a31d09376491b8f61b7ded" gracePeriod=2 Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.248445 4761 generic.go:334] "Generic (PLEG): container finished" podID="983972ee-4dc5-4a52-9087-d69d4362b33d" containerID="435a9e4af327ad4f13b8f52357e4af2165758f8c91ccec01cd70165e692ee672" exitCode=0 Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.248507 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" event={"ID":"983972ee-4dc5-4a52-9087-d69d4362b33d","Type":"ContainerDied","Data":"435a9e4af327ad4f13b8f52357e4af2165758f8c91ccec01cd70165e692ee672"} Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.249615 4761 generic.go:334] "Generic (PLEG): container finished" podID="33f5c4fc-08a4-4683-ab53-e20612b27d02" containerID="af537995f5c7635f7dc3bb4959993d721813c4ebd32be939fb412f5f958aa966" exitCode=0 Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.249647 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" 
event={"ID":"33f5c4fc-08a4-4683-ab53-e20612b27d02","Type":"ContainerDied","Data":"af537995f5c7635f7dc3bb4959993d721813c4ebd32be939fb412f5f958aa966"} Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.251502 4761 generic.go:334] "Generic (PLEG): container finished" podID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerID="e40b73fa6fa1ac4b4bc6c9fe0bccd417823834ada5a31d09376491b8f61b7ded" exitCode=0 Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.251525 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wblg" event={"ID":"6ba40ceb-381a-41e3-8d11-ed171d07ee74","Type":"ContainerDied","Data":"e40b73fa6fa1ac4b4bc6c9fe0bccd417823834ada5a31d09376491b8f61b7ded"} Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.549875 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.559115 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.600403 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.603175 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.653270 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.658087 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-utilities\") pod \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.658132 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-config\") pod \"983972ee-4dc5-4a52-9087-d69d4362b33d\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.658192 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/983972ee-4dc5-4a52-9087-d69d4362b33d-serving-cert\") pod \"983972ee-4dc5-4a52-9087-d69d4362b33d\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.658230 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc9jr\" (UniqueName: \"kubernetes.io/projected/6ba40ceb-381a-41e3-8d11-ed171d07ee74-kube-api-access-hc9jr\") pod 
\"6ba40ceb-381a-41e3-8d11-ed171d07ee74\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.658265 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-proxy-ca-bundles\") pod \"983972ee-4dc5-4a52-9087-d69d4362b33d\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.658301 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-client-ca\") pod \"983972ee-4dc5-4a52-9087-d69d4362b33d\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.658326 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lwfm\" (UniqueName: \"kubernetes.io/projected/983972ee-4dc5-4a52-9087-d69d4362b33d-kube-api-access-6lwfm\") pod \"983972ee-4dc5-4a52-9087-d69d4362b33d\" (UID: \"983972ee-4dc5-4a52-9087-d69d4362b33d\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.658367 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-catalog-content\") pod \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\" (UID: \"6ba40ceb-381a-41e3-8d11-ed171d07ee74\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.659818 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-config" (OuterVolumeSpecName: "config") pod "983972ee-4dc5-4a52-9087-d69d4362b33d" (UID: "983972ee-4dc5-4a52-9087-d69d4362b33d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.661746 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-utilities" (OuterVolumeSpecName: "utilities") pod "6ba40ceb-381a-41e3-8d11-ed171d07ee74" (UID: "6ba40ceb-381a-41e3-8d11-ed171d07ee74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.662092 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-client-ca" (OuterVolumeSpecName: "client-ca") pod "983972ee-4dc5-4a52-9087-d69d4362b33d" (UID: "983972ee-4dc5-4a52-9087-d69d4362b33d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.662858 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "983972ee-4dc5-4a52-9087-d69d4362b33d" (UID: "983972ee-4dc5-4a52-9087-d69d4362b33d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.665235 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/983972ee-4dc5-4a52-9087-d69d4362b33d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "983972ee-4dc5-4a52-9087-d69d4362b33d" (UID: "983972ee-4dc5-4a52-9087-d69d4362b33d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.665676 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/983972ee-4dc5-4a52-9087-d69d4362b33d-kube-api-access-6lwfm" (OuterVolumeSpecName: "kube-api-access-6lwfm") pod "983972ee-4dc5-4a52-9087-d69d4362b33d" (UID: "983972ee-4dc5-4a52-9087-d69d4362b33d"). InnerVolumeSpecName "kube-api-access-6lwfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.666691 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba40ceb-381a-41e3-8d11-ed171d07ee74-kube-api-access-hc9jr" (OuterVolumeSpecName: "kube-api-access-hc9jr") pod "6ba40ceb-381a-41e3-8d11-ed171d07ee74" (UID: "6ba40ceb-381a-41e3-8d11-ed171d07ee74"). InnerVolumeSpecName "kube-api-access-hc9jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.674203 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.759927 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-client-ca\") pod \"33f5c4fc-08a4-4683-ab53-e20612b27d02\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.759992 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7wbb\" (UniqueName: \"kubernetes.io/projected/33f5c4fc-08a4-4683-ab53-e20612b27d02-kube-api-access-g7wbb\") pod \"33f5c4fc-08a4-4683-ab53-e20612b27d02\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760074 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f5c4fc-08a4-4683-ab53-e20612b27d02-serving-cert\") pod \"33f5c4fc-08a4-4683-ab53-e20612b27d02\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760109 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-config\") pod \"33f5c4fc-08a4-4683-ab53-e20612b27d02\" (UID: \"33f5c4fc-08a4-4683-ab53-e20612b27d02\") " Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760303 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760314 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-config\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760322 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/983972ee-4dc5-4a52-9087-d69d4362b33d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760331 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc9jr\" (UniqueName: \"kubernetes.io/projected/6ba40ceb-381a-41e3-8d11-ed171d07ee74-kube-api-access-hc9jr\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760339 4761 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760346 4761 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/983972ee-4dc5-4a52-9087-d69d4362b33d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760354 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lwfm\" (UniqueName: \"kubernetes.io/projected/983972ee-4dc5-4a52-9087-d69d4362b33d-kube-api-access-6lwfm\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.760963 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-client-ca" (OuterVolumeSpecName: "client-ca") pod "33f5c4fc-08a4-4683-ab53-e20612b27d02" (UID: "33f5c4fc-08a4-4683-ab53-e20612b27d02"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.761045 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-config" (OuterVolumeSpecName: "config") pod "33f5c4fc-08a4-4683-ab53-e20612b27d02" (UID: "33f5c4fc-08a4-4683-ab53-e20612b27d02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.764863 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f5c4fc-08a4-4683-ab53-e20612b27d02-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "33f5c4fc-08a4-4683-ab53-e20612b27d02" (UID: "33f5c4fc-08a4-4683-ab53-e20612b27d02"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.764898 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f5c4fc-08a4-4683-ab53-e20612b27d02-kube-api-access-g7wbb" (OuterVolumeSpecName: "kube-api-access-g7wbb") pod "33f5c4fc-08a4-4683-ab53-e20612b27d02" (UID: "33f5c4fc-08a4-4683-ab53-e20612b27d02"). InnerVolumeSpecName "kube-api-access-g7wbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.780770 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ba40ceb-381a-41e3-8d11-ed171d07ee74" (UID: "6ba40ceb-381a-41e3-8d11-ed171d07ee74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.861632 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f5c4fc-08a4-4683-ab53-e20612b27d02-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.861873 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.861975 4761 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f5c4fc-08a4-4683-ab53-e20612b27d02-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.862040 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7wbb\" (UniqueName: \"kubernetes.io/projected/33f5c4fc-08a4-4683-ab53-e20612b27d02-kube-api-access-g7wbb\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.862109 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba40ceb-381a-41e3-8d11-ed171d07ee74-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.995898 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5c886" Dec 01 10:34:25 crc kubenswrapper[4761]: I1201 10:34:25.996370 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5c886" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.033872 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5c886" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 
10:34:26.260934 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" event={"ID":"983972ee-4dc5-4a52-9087-d69d4362b33d","Type":"ContainerDied","Data":"5b2e2295d347096af2be3e4789f6ba678118a1dfe8d2e1a7a2de336528385972"} Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.260994 4761 scope.go:117] "RemoveContainer" containerID="435a9e4af327ad4f13b8f52357e4af2165758f8c91ccec01cd70165e692ee672" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.260995 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vpsf2" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.263735 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.263758 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl" event={"ID":"33f5c4fc-08a4-4683-ab53-e20612b27d02","Type":"ContainerDied","Data":"6b84446ab1023db3abc8d774bf88d7aedff5d159473099ed48acbdd31d142deb"} Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.273509 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wblg" event={"ID":"6ba40ceb-381a-41e3-8d11-ed171d07ee74","Type":"ContainerDied","Data":"492bd18f07d1464448de0c57f56b3b5c8e0fea7676ae9cc3be1a370ec35e3b50"} Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.273610 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2wblg" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.284457 4761 scope.go:117] "RemoveContainer" containerID="af537995f5c7635f7dc3bb4959993d721813c4ebd32be939fb412f5f958aa966" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.297687 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl"] Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.300914 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bxmxl"] Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.309164 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpsf2"] Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.309426 4761 scope.go:117] "RemoveContainer" containerID="e40b73fa6fa1ac4b4bc6c9fe0bccd417823834ada5a31d09376491b8f61b7ded" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.311861 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vpsf2"] Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.321661 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2wblg"] Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.324990 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2wblg"] Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.332633 4761 scope.go:117] "RemoveContainer" containerID="0212e394b3dd72631bc22455f87dabe8320ab9b69595cb2f72c4fbcce4b44ae3" Dec 01 10:34:26 crc kubenswrapper[4761]: I1201 10:34:26.345740 4761 scope.go:117] "RemoveContainer" containerID="bac0e110bd962c1c9c78530027d0b0c307456868e8728559ea9fd8c02a00a78f" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.139408 4761 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f5c4fc-08a4-4683-ab53-e20612b27d02" path="/var/lib/kubelet/pods/33f5c4fc-08a4-4683-ab53-e20612b27d02/volumes" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.140876 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" path="/var/lib/kubelet/pods/6ba40ceb-381a-41e3-8d11-ed171d07ee74/volumes" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.142205 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="983972ee-4dc5-4a52-9087-d69d4362b33d" path="/var/lib/kubelet/pods/983972ee-4dc5-4a52-9087-d69d4362b33d/volumes" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.356481 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.363107 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5c886" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.972474 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7599fffbbc-m2wfl"] Dec 01 10:34:27 crc kubenswrapper[4761]: E1201 10:34:27.972886 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983972ee-4dc5-4a52-9087-d69d4362b33d" containerName="controller-manager" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.972959 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="983972ee-4dc5-4a52-9087-d69d4362b33d" containerName="controller-manager" Dec 01 10:34:27 crc kubenswrapper[4761]: E1201 10:34:27.973023 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerName="registry-server" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.973089 4761 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerName="registry-server" Dec 01 10:34:27 crc kubenswrapper[4761]: E1201 10:34:27.973158 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a570f753-345e-40b8-a088-2d28ecf41896" containerName="extract-utilities" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.973214 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="a570f753-345e-40b8-a088-2d28ecf41896" containerName="extract-utilities" Dec 01 10:34:27 crc kubenswrapper[4761]: E1201 10:34:27.973272 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f5c4fc-08a4-4683-ab53-e20612b27d02" containerName="route-controller-manager" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.973332 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f5c4fc-08a4-4683-ab53-e20612b27d02" containerName="route-controller-manager" Dec 01 10:34:27 crc kubenswrapper[4761]: E1201 10:34:27.973390 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerName="extract-content" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.973444 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerName="extract-content" Dec 01 10:34:27 crc kubenswrapper[4761]: E1201 10:34:27.973511 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerName="extract-utilities" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.973590 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerName="extract-utilities" Dec 01 10:34:27 crc kubenswrapper[4761]: E1201 10:34:27.973653 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a570f753-345e-40b8-a088-2d28ecf41896" containerName="registry-server" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.973708 4761 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a570f753-345e-40b8-a088-2d28ecf41896" containerName="registry-server" Dec 01 10:34:27 crc kubenswrapper[4761]: E1201 10:34:27.973764 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a570f753-345e-40b8-a088-2d28ecf41896" containerName="extract-content" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.973831 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="a570f753-345e-40b8-a088-2d28ecf41896" containerName="extract-content" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.974006 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f5c4fc-08a4-4683-ab53-e20612b27d02" containerName="route-controller-manager" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.974073 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba40ceb-381a-41e3-8d11-ed171d07ee74" containerName="registry-server" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.974131 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="a570f753-345e-40b8-a088-2d28ecf41896" containerName="registry-server" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.974188 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="983972ee-4dc5-4a52-9087-d69d4362b33d" containerName="controller-manager" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.974669 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.974992 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm"] Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.975855 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.977531 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.978095 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.978255 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.978339 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.978531 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.978704 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.978829 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.978999 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.979980 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.980158 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 
10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.980607 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.980608 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.985473 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:34:27 crc kubenswrapper[4761]: I1201 10:34:27.991014 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7599fffbbc-m2wfl"] Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.000226 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm"] Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102192 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-client-ca\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102242 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-proxy-ca-bundles\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102276 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pss7q\" (UniqueName: \"kubernetes.io/projected/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-kube-api-access-pss7q\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102302 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnf8r\" (UniqueName: \"kubernetes.io/projected/197f7c4e-f0a3-4f54-95ca-d069391728bf-kube-api-access-cnf8r\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102399 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-config\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102455 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-serving-cert\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102598 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-config\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " 
pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102661 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-client-ca\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.102715 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197f7c4e-f0a3-4f54-95ca-d069391728bf-serving-cert\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203616 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-config\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203715 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-client-ca\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203771 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/197f7c4e-f0a3-4f54-95ca-d069391728bf-serving-cert\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203805 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-client-ca\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203826 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-proxy-ca-bundles\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203849 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnf8r\" (UniqueName: \"kubernetes.io/projected/197f7c4e-f0a3-4f54-95ca-d069391728bf-kube-api-access-cnf8r\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203869 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pss7q\" (UniqueName: \"kubernetes.io/projected/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-kube-api-access-pss7q\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 
10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203908 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-config\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.203944 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-serving-cert\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.206061 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-config\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.206102 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-client-ca\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.206173 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-client-ca\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " 
pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.206400 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-proxy-ca-bundles\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.206450 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-config\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.211360 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-serving-cert\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.211374 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197f7c4e-f0a3-4f54-95ca-d069391728bf-serving-cert\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.223064 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pss7q\" (UniqueName: 
\"kubernetes.io/projected/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-kube-api-access-pss7q\") pod \"controller-manager-7599fffbbc-m2wfl\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.225748 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnf8r\" (UniqueName: \"kubernetes.io/projected/197f7c4e-f0a3-4f54-95ca-d069391728bf-kube-api-access-cnf8r\") pod \"route-controller-manager-7786c744cc-p2gsm\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.269724 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.269784 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.299369 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.308071 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.310306 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.376357 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.498572 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7599fffbbc-m2wfl"] Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.530419 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm"] Dec 01 10:34:28 crc kubenswrapper[4761]: W1201 10:34:28.533698 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod197f7c4e_f0a3_4f54_95ca_d069391728bf.slice/crio-7191dbc3d405384a5edb42f4912e64092429ddcc2d749b93e4ef6af041d30936 WatchSource:0}: Error finding container 7191dbc3d405384a5edb42f4912e64092429ddcc2d749b93e4ef6af041d30936: Status 404 returned error can't find the container with id 7191dbc3d405384a5edb42f4912e64092429ddcc2d749b93e4ef6af041d30936 Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.740588 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.740641 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:34:28 crc kubenswrapper[4761]: I1201 10:34:28.788398 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:34:28 crc 
kubenswrapper[4761]: I1201 10:34:28.930724 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktzw"] Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.301244 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" event={"ID":"1aa17ca4-d38b-42ac-9692-674c0f9d7c37","Type":"ContainerStarted","Data":"34daeaf6d9c4e22aacbeb16f561b9abc25c1e0bea7efac6e282e5ce3b1e37fba"} Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.301304 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" event={"ID":"1aa17ca4-d38b-42ac-9692-674c0f9d7c37","Type":"ContainerStarted","Data":"69b6a40719b990d8e9ff95e07520b183377cc59aa1126860e13fc14549979720"} Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.301483 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.303351 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" event={"ID":"197f7c4e-f0a3-4f54-95ca-d069391728bf","Type":"ContainerStarted","Data":"ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527"} Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.303430 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" event={"ID":"197f7c4e-f0a3-4f54-95ca-d069391728bf","Type":"ContainerStarted","Data":"7191dbc3d405384a5edb42f4912e64092429ddcc2d749b93e4ef6af041d30936"} Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.304029 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:29 crc kubenswrapper[4761]: 
I1201 10:34:29.306036 4761 patch_prober.go:28] interesting pod/controller-manager-7599fffbbc-m2wfl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.306082 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" podUID="1aa17ca4-d38b-42ac-9692-674c0f9d7c37" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.306742 4761 patch_prober.go:28] interesting pod/route-controller-manager-7786c744cc-p2gsm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.306941 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" podUID="197f7c4e-f0a3-4f54-95ca-d069391728bf" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.328125 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" podStartSLOduration=5.328107678 podStartE2EDuration="5.328107678s" podCreationTimestamp="2025-12-01 10:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:29.326519273 
+0000 UTC m=+208.630277957" watchObservedRunningTime="2025-12-01 10:34:29.328107678 +0000 UTC m=+208.631866302" Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.357056 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" podStartSLOduration=5.357038966 podStartE2EDuration="5.357038966s" podCreationTimestamp="2025-12-01 10:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:29.353971939 +0000 UTC m=+208.657730603" watchObservedRunningTime="2025-12-01 10:34:29.357038966 +0000 UTC m=+208.660797580" Dec 01 10:34:29 crc kubenswrapper[4761]: I1201 10:34:29.368108 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.307558 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nktzw" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerName="registry-server" containerID="cri-o://3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399" gracePeriod=2 Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.314076 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.315189 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.665704 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.725437 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5c886"] Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.725670 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5c886" podUID="5821e59d-de93-43fd-822d-83128ce780de" containerName="registry-server" containerID="cri-o://da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787" gracePeriod=2 Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.749495 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh6cg\" (UniqueName: \"kubernetes.io/projected/95b42196-1572-4e3f-b807-4ef64ed3311f-kube-api-access-qh6cg\") pod \"95b42196-1572-4e3f-b807-4ef64ed3311f\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.750538 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-catalog-content\") pod \"95b42196-1572-4e3f-b807-4ef64ed3311f\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.750570 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-utilities\") pod \"95b42196-1572-4e3f-b807-4ef64ed3311f\" (UID: \"95b42196-1572-4e3f-b807-4ef64ed3311f\") " Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.751464 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-utilities" (OuterVolumeSpecName: "utilities") pod "95b42196-1572-4e3f-b807-4ef64ed3311f" (UID: 
"95b42196-1572-4e3f-b807-4ef64ed3311f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.759880 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b42196-1572-4e3f-b807-4ef64ed3311f-kube-api-access-qh6cg" (OuterVolumeSpecName: "kube-api-access-qh6cg") pod "95b42196-1572-4e3f-b807-4ef64ed3311f" (UID: "95b42196-1572-4e3f-b807-4ef64ed3311f"). InnerVolumeSpecName "kube-api-access-qh6cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.772906 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95b42196-1572-4e3f-b807-4ef64ed3311f" (UID: "95b42196-1572-4e3f-b807-4ef64ed3311f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.851889 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh6cg\" (UniqueName: \"kubernetes.io/projected/95b42196-1572-4e3f-b807-4ef64ed3311f-kube-api-access-qh6cg\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.851938 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:30 crc kubenswrapper[4761]: I1201 10:34:30.851981 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b42196-1572-4e3f-b807-4ef64ed3311f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.124000 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5c886" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.161133 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-catalog-content\") pod \"5821e59d-de93-43fd-822d-83128ce780de\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.161228 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-utilities\") pod \"5821e59d-de93-43fd-822d-83128ce780de\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.161291 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x8bg\" (UniqueName: \"kubernetes.io/projected/5821e59d-de93-43fd-822d-83128ce780de-kube-api-access-8x8bg\") pod \"5821e59d-de93-43fd-822d-83128ce780de\" (UID: \"5821e59d-de93-43fd-822d-83128ce780de\") " Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.162223 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-utilities" (OuterVolumeSpecName: "utilities") pod "5821e59d-de93-43fd-822d-83128ce780de" (UID: "5821e59d-de93-43fd-822d-83128ce780de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.164838 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5821e59d-de93-43fd-822d-83128ce780de-kube-api-access-8x8bg" (OuterVolumeSpecName: "kube-api-access-8x8bg") pod "5821e59d-de93-43fd-822d-83128ce780de" (UID: "5821e59d-de93-43fd-822d-83128ce780de"). InnerVolumeSpecName "kube-api-access-8x8bg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.224050 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5821e59d-de93-43fd-822d-83128ce780de" (UID: "5821e59d-de93-43fd-822d-83128ce780de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.263063 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x8bg\" (UniqueName: \"kubernetes.io/projected/5821e59d-de93-43fd-822d-83128ce780de-kube-api-access-8x8bg\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.263103 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.263117 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5821e59d-de93-43fd-822d-83128ce780de-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.316395 4761 generic.go:334] "Generic (PLEG): container finished" podID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerID="3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399" exitCode=0 Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.316469 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktzw" event={"ID":"95b42196-1572-4e3f-b807-4ef64ed3311f","Type":"ContainerDied","Data":"3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399"} Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.316472 4761 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nktzw" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.316645 4761 scope.go:117] "RemoveContainer" containerID="3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.317062 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktzw" event={"ID":"95b42196-1572-4e3f-b807-4ef64ed3311f","Type":"ContainerDied","Data":"2afe4f14967b1a8a0bdb76005f3dccdc7829cc495c2b5fb169d9e55fc297a6bd"} Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.320953 4761 generic.go:334] "Generic (PLEG): container finished" podID="5821e59d-de93-43fd-822d-83128ce780de" containerID="da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787" exitCode=0 Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.321089 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5c886" event={"ID":"5821e59d-de93-43fd-822d-83128ce780de","Type":"ContainerDied","Data":"da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787"} Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.321128 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5c886" event={"ID":"5821e59d-de93-43fd-822d-83128ce780de","Type":"ContainerDied","Data":"f29837df7d9075929fe3cc29214ed5a4ea3fe80e9ac3d92e2eb41a26d1d429bc"} Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.321672 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5c886" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.341443 4761 scope.go:117] "RemoveContainer" containerID="0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.358350 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktzw"] Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.362383 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktzw"] Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.372218 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5c886"] Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.372438 4761 scope.go:117] "RemoveContainer" containerID="20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.375605 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5c886"] Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.391749 4761 scope.go:117] "RemoveContainer" containerID="3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399" Dec 01 10:34:31 crc kubenswrapper[4761]: E1201 10:34:31.392159 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399\": container with ID starting with 3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399 not found: ID does not exist" containerID="3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.392206 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399"} err="failed to get container status \"3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399\": rpc error: code = NotFound desc = could not find container \"3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399\": container with ID starting with 3e40c938f2fead5592f264151ef3d341a20ad08a057d996b1813b31b9b86c399 not found: ID does not exist" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.392239 4761 scope.go:117] "RemoveContainer" containerID="0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2" Dec 01 10:34:31 crc kubenswrapper[4761]: E1201 10:34:31.392681 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2\": container with ID starting with 0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2 not found: ID does not exist" containerID="0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.392748 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2"} err="failed to get container status \"0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2\": rpc error: code = NotFound desc = could not find container \"0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2\": container with ID starting with 0e010a506204c68f07a2639d9d6611e8fdc5da916df102de278cc535cd3bfda2 not found: ID does not exist" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.392791 4761 scope.go:117] "RemoveContainer" containerID="20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84" Dec 01 10:34:31 crc kubenswrapper[4761]: E1201 10:34:31.393237 4761 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84\": container with ID starting with 20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84 not found: ID does not exist" containerID="20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.393273 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84"} err="failed to get container status \"20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84\": rpc error: code = NotFound desc = could not find container \"20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84\": container with ID starting with 20dc5e5aab8992fcfa0628152740144c398964c3fe0bd94d5031f4deed113a84 not found: ID does not exist" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.393298 4761 scope.go:117] "RemoveContainer" containerID="da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.408141 4761 scope.go:117] "RemoveContainer" containerID="aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.432198 4761 scope.go:117] "RemoveContainer" containerID="c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.462227 4761 scope.go:117] "RemoveContainer" containerID="da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787" Dec 01 10:34:31 crc kubenswrapper[4761]: E1201 10:34:31.463951 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787\": container with ID starting with 
da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787 not found: ID does not exist" containerID="da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.463990 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787"} err="failed to get container status \"da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787\": rpc error: code = NotFound desc = could not find container \"da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787\": container with ID starting with da5c3d7d0a089bdbccfaac0c4d052c14aeacbd07075168752efff4c90cb55787 not found: ID does not exist" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.464021 4761 scope.go:117] "RemoveContainer" containerID="aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1" Dec 01 10:34:31 crc kubenswrapper[4761]: E1201 10:34:31.464520 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1\": container with ID starting with aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1 not found: ID does not exist" containerID="aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.464568 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1"} err="failed to get container status \"aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1\": rpc error: code = NotFound desc = could not find container \"aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1\": container with ID starting with aea91db9460664c74ce2f6b1f53b0be6cc199c5226111540432fb57d18d272c1 not found: ID does not 
exist" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.464587 4761 scope.go:117] "RemoveContainer" containerID="c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed" Dec 01 10:34:31 crc kubenswrapper[4761]: E1201 10:34:31.465043 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed\": container with ID starting with c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed not found: ID does not exist" containerID="c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed" Dec 01 10:34:31 crc kubenswrapper[4761]: I1201 10:34:31.465073 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed"} err="failed to get container status \"c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed\": rpc error: code = NotFound desc = could not find container \"c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed\": container with ID starting with c6024f41603904269697db865157e889b50b8168ae793649f536c1d689b2e7ed not found: ID does not exist" Dec 01 10:34:33 crc kubenswrapper[4761]: I1201 10:34:33.140066 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5821e59d-de93-43fd-822d-83128ce780de" path="/var/lib/kubelet/pods/5821e59d-de93-43fd-822d-83128ce780de/volumes" Dec 01 10:34:33 crc kubenswrapper[4761]: I1201 10:34:33.143025 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" path="/var/lib/kubelet/pods/95b42196-1572-4e3f-b807-4ef64ed3311f/volumes" Dec 01 10:34:33 crc kubenswrapper[4761]: I1201 10:34:33.695627 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" podUID="873ee65e-5320-4949-8caa-893b41061408" 
containerName="oauth-openshift" containerID="cri-o://0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16" gracePeriod=15 Dec 01 10:34:33 crc kubenswrapper[4761]: I1201 10:34:33.850818 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:34:33 crc kubenswrapper[4761]: I1201 10:34:33.850875 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:34:33 crc kubenswrapper[4761]: I1201 10:34:33.850920 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:34:33 crc kubenswrapper[4761]: I1201 10:34:33.851630 4761 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:34:33 crc kubenswrapper[4761]: I1201 10:34:33.851707 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4" gracePeriod=600 Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.159603 4761 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207271 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-router-certs\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207341 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-cliconfig\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207360 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-error\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207388 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-login\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207413 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-provider-selection\") pod 
\"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207438 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-trusted-ca-bundle\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207452 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-idp-0-file-data\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207502 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-audit-policies\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207524 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-ocp-branding-template\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207559 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-service-ca\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: 
\"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207584 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/873ee65e-5320-4949-8caa-893b41061408-audit-dir\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207614 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-serving-cert\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207651 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-session\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.207674 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkd7v\" (UniqueName: \"kubernetes.io/projected/873ee65e-5320-4949-8caa-893b41061408-kube-api-access-fkd7v\") pod \"873ee65e-5320-4949-8caa-893b41061408\" (UID: \"873ee65e-5320-4949-8caa-893b41061408\") " Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.209273 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/873ee65e-5320-4949-8caa-893b41061408-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.209330 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.208831 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.209924 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.210247 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.213427 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873ee65e-5320-4949-8caa-893b41061408-kube-api-access-fkd7v" (OuterVolumeSpecName: "kube-api-access-fkd7v") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "kube-api-access-fkd7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.215441 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.215538 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.216784 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.218054 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.218373 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.218420 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.218516 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.218758 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "873ee65e-5320-4949-8caa-893b41061408" (UID: "873ee65e-5320-4949-8caa-893b41061408"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.309403 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.309449 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.309466 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.309480 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.309495 4761 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310068 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310099 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310115 4761 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/873ee65e-5320-4949-8caa-893b41061408-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310128 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310142 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310155 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkd7v\" (UniqueName: \"kubernetes.io/projected/873ee65e-5320-4949-8caa-893b41061408-kube-api-access-fkd7v\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310166 4761 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310178 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.310190 4761 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/873ee65e-5320-4949-8caa-893b41061408-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.350965 4761 generic.go:334] "Generic (PLEG): container finished" podID="873ee65e-5320-4949-8caa-893b41061408" containerID="0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16" exitCode=0 Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.351052 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" event={"ID":"873ee65e-5320-4949-8caa-893b41061408","Type":"ContainerDied","Data":"0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16"} Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.351086 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" event={"ID":"873ee65e-5320-4949-8caa-893b41061408","Type":"ContainerDied","Data":"c30c8365f2ff7d6cc04e51be8e5e3dc117a07ce71f6c49a1ded04bba2fe69d14"} Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.351105 4761 scope.go:117] "RemoveContainer" containerID="0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.351218 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-82k6m" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.353647 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4" exitCode=0 Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.353689 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4"} Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.375389 4761 scope.go:117] "RemoveContainer" containerID="0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16" Dec 01 10:34:34 crc kubenswrapper[4761]: E1201 10:34:34.376262 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16\": container with ID starting with 0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16 not found: ID does not exist" containerID="0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.376366 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16"} err="failed to get container status \"0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16\": rpc error: code = NotFound desc = could not find container \"0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16\": container with ID starting with 0e8434e7f4a3121da102091102a5e31721ce4cce6cec55ff169ddaa33b8aaf16 not found: ID does not exist" Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.386473 4761 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82k6m"] Dec 01 10:34:34 crc kubenswrapper[4761]: I1201 10:34:34.390761 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-82k6m"] Dec 01 10:34:35 crc kubenswrapper[4761]: I1201 10:34:35.136681 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873ee65e-5320-4949-8caa-893b41061408" path="/var/lib/kubelet/pods/873ee65e-5320-4949-8caa-893b41061408/volumes" Dec 01 10:34:35 crc kubenswrapper[4761]: I1201 10:34:35.360951 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"c2d1f25511ab54e969e2db56032fd59e29dcd744fd868077745072a36be032ba"} Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.977080 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh"] Dec 01 10:34:43 crc kubenswrapper[4761]: E1201 10:34:43.977820 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerName="registry-server" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.977835 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerName="registry-server" Dec 01 10:34:43 crc kubenswrapper[4761]: E1201 10:34:43.977845 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873ee65e-5320-4949-8caa-893b41061408" containerName="oauth-openshift" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.977851 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="873ee65e-5320-4949-8caa-893b41061408" containerName="oauth-openshift" Dec 01 10:34:43 crc kubenswrapper[4761]: E1201 10:34:43.977863 4761 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5821e59d-de93-43fd-822d-83128ce780de" containerName="extract-utilities" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.977869 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="5821e59d-de93-43fd-822d-83128ce780de" containerName="extract-utilities" Dec 01 10:34:43 crc kubenswrapper[4761]: E1201 10:34:43.977884 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5821e59d-de93-43fd-822d-83128ce780de" containerName="registry-server" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.977890 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="5821e59d-de93-43fd-822d-83128ce780de" containerName="registry-server" Dec 01 10:34:43 crc kubenswrapper[4761]: E1201 10:34:43.977897 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5821e59d-de93-43fd-822d-83128ce780de" containerName="extract-content" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.977902 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="5821e59d-de93-43fd-822d-83128ce780de" containerName="extract-content" Dec 01 10:34:43 crc kubenswrapper[4761]: E1201 10:34:43.977911 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerName="extract-content" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.977917 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerName="extract-content" Dec 01 10:34:43 crc kubenswrapper[4761]: E1201 10:34:43.977926 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerName="extract-utilities" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.977931 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerName="extract-utilities" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.978036 4761 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5821e59d-de93-43fd-822d-83128ce780de" containerName="registry-server" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.978052 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="873ee65e-5320-4949-8caa-893b41061408" containerName="oauth-openshift" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.978063 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b42196-1572-4e3f-b807-4ef64ed3311f" containerName="registry-server" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.978426 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.981524 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.981707 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.981937 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.983007 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.983273 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.983284 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.983276 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"audit" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.983520 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.983529 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.983532 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.984769 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 10:34:43 crc kubenswrapper[4761]: I1201 10:34:43.986564 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.000802 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.001377 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.004131 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.008283 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh"] Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.036609 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-error\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.036922 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.036948 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.036972 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.036989 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-audit-policies\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: 
\"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037035 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-login\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037052 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037067 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfc2f\" (UniqueName: \"kubernetes.io/projected/0a363345-ccc6-4328-a283-007e9b83e4c6-kube-api-access-gfc2f\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037094 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a363345-ccc6-4328-a283-007e9b83e4c6-audit-dir\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037190 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037250 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037308 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-session\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037341 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.037368 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.077992 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7599fffbbc-m2wfl"] Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.078258 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" podUID="1aa17ca4-d38b-42ac-9692-674c0f9d7c37" containerName="controller-manager" containerID="cri-o://34daeaf6d9c4e22aacbeb16f561b9abc25c1e0bea7efac6e282e5ce3b1e37fba" gracePeriod=30 Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.102681 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm"] Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.102911 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" podUID="197f7c4e-f0a3-4f54-95ca-d069391728bf" containerName="route-controller-manager" containerID="cri-o://ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527" gracePeriod=30 Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138116 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138159 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-audit-policies\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138190 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-login\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138215 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138241 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfc2f\" (UniqueName: \"kubernetes.io/projected/0a363345-ccc6-4328-a283-007e9b83e4c6-kube-api-access-gfc2f\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138268 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a363345-ccc6-4328-a283-007e9b83e4c6-audit-dir\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " 
pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138294 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138319 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138350 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-session\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138367 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138388 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138408 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-error\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138431 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.138450 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.140289 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a363345-ccc6-4328-a283-007e9b83e4c6-audit-dir\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " 
pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.140359 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.140630 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.141165 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-audit-policies\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.142089 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.143207 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-session\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.143323 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.143686 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.143975 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-error\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.144622 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 
10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.151160 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.159363 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfc2f\" (UniqueName: \"kubernetes.io/projected/0a363345-ccc6-4328-a283-007e9b83e4c6-kube-api-access-gfc2f\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.162686 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-login\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.162991 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a363345-ccc6-4328-a283-007e9b83e4c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bc9f7ddc4-hmgbh\" (UID: \"0a363345-ccc6-4328-a283-007e9b83e4c6\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.294016 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.437189 4761 generic.go:334] "Generic (PLEG): container finished" podID="1aa17ca4-d38b-42ac-9692-674c0f9d7c37" containerID="34daeaf6d9c4e22aacbeb16f561b9abc25c1e0bea7efac6e282e5ce3b1e37fba" exitCode=0 Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.437408 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" event={"ID":"1aa17ca4-d38b-42ac-9692-674c0f9d7c37","Type":"ContainerDied","Data":"34daeaf6d9c4e22aacbeb16f561b9abc25c1e0bea7efac6e282e5ce3b1e37fba"} Dec 01 10:34:44 crc kubenswrapper[4761]: I1201 10:34:44.767823 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh"] Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.086522 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.155315 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197f7c4e-f0a3-4f54-95ca-d069391728bf-serving-cert\") pod \"197f7c4e-f0a3-4f54-95ca-d069391728bf\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.155376 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-config\") pod \"197f7c4e-f0a3-4f54-95ca-d069391728bf\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.155449 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnf8r\" (UniqueName: 
\"kubernetes.io/projected/197f7c4e-f0a3-4f54-95ca-d069391728bf-kube-api-access-cnf8r\") pod \"197f7c4e-f0a3-4f54-95ca-d069391728bf\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.155536 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-client-ca\") pod \"197f7c4e-f0a3-4f54-95ca-d069391728bf\" (UID: \"197f7c4e-f0a3-4f54-95ca-d069391728bf\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.156395 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-config" (OuterVolumeSpecName: "config") pod "197f7c4e-f0a3-4f54-95ca-d069391728bf" (UID: "197f7c4e-f0a3-4f54-95ca-d069391728bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.156736 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-client-ca" (OuterVolumeSpecName: "client-ca") pod "197f7c4e-f0a3-4f54-95ca-d069391728bf" (UID: "197f7c4e-f0a3-4f54-95ca-d069391728bf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.162792 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197f7c4e-f0a3-4f54-95ca-d069391728bf-kube-api-access-cnf8r" (OuterVolumeSpecName: "kube-api-access-cnf8r") pod "197f7c4e-f0a3-4f54-95ca-d069391728bf" (UID: "197f7c4e-f0a3-4f54-95ca-d069391728bf"). InnerVolumeSpecName "kube-api-access-cnf8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.167741 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197f7c4e-f0a3-4f54-95ca-d069391728bf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "197f7c4e-f0a3-4f54-95ca-d069391728bf" (UID: "197f7c4e-f0a3-4f54-95ca-d069391728bf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.193632 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257052 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pss7q\" (UniqueName: \"kubernetes.io/projected/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-kube-api-access-pss7q\") pod \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257131 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-proxy-ca-bundles\") pod \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257157 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-config\") pod \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257197 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-client-ca\") pod \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257249 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-serving-cert\") pod \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\" (UID: \"1aa17ca4-d38b-42ac-9692-674c0f9d7c37\") " Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257472 4761 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257494 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197f7c4e-f0a3-4f54-95ca-d069391728bf-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257507 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197f7c4e-f0a3-4f54-95ca-d069391728bf-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.257520 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnf8r\" (UniqueName: \"kubernetes.io/projected/197f7c4e-f0a3-4f54-95ca-d069391728bf-kube-api-access-cnf8r\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.258099 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1aa17ca4-d38b-42ac-9692-674c0f9d7c37" (UID: "1aa17ca4-d38b-42ac-9692-674c0f9d7c37"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.258195 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-config" (OuterVolumeSpecName: "config") pod "1aa17ca4-d38b-42ac-9692-674c0f9d7c37" (UID: "1aa17ca4-d38b-42ac-9692-674c0f9d7c37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.258115 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-client-ca" (OuterVolumeSpecName: "client-ca") pod "1aa17ca4-d38b-42ac-9692-674c0f9d7c37" (UID: "1aa17ca4-d38b-42ac-9692-674c0f9d7c37"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.260652 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-kube-api-access-pss7q" (OuterVolumeSpecName: "kube-api-access-pss7q") pod "1aa17ca4-d38b-42ac-9692-674c0f9d7c37" (UID: "1aa17ca4-d38b-42ac-9692-674c0f9d7c37"). InnerVolumeSpecName "kube-api-access-pss7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.261086 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1aa17ca4-d38b-42ac-9692-674c0f9d7c37" (UID: "1aa17ca4-d38b-42ac-9692-674c0f9d7c37"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.358355 4761 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.358395 4761 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.358405 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pss7q\" (UniqueName: \"kubernetes.io/projected/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-kube-api-access-pss7q\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.358414 4761 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.358423 4761 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aa17ca4-d38b-42ac-9692-674c0f9d7c37-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.444947 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.444940 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7599fffbbc-m2wfl" event={"ID":"1aa17ca4-d38b-42ac-9692-674c0f9d7c37","Type":"ContainerDied","Data":"69b6a40719b990d8e9ff95e07520b183377cc59aa1126860e13fc14549979720"} Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.445111 4761 scope.go:117] "RemoveContainer" containerID="34daeaf6d9c4e22aacbeb16f561b9abc25c1e0bea7efac6e282e5ce3b1e37fba" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.446575 4761 generic.go:334] "Generic (PLEG): container finished" podID="197f7c4e-f0a3-4f54-95ca-d069391728bf" containerID="ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527" exitCode=0 Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.446678 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.447102 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" event={"ID":"197f7c4e-f0a3-4f54-95ca-d069391728bf","Type":"ContainerDied","Data":"ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527"} Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.447157 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm" event={"ID":"197f7c4e-f0a3-4f54-95ca-d069391728bf","Type":"ContainerDied","Data":"7191dbc3d405384a5edb42f4912e64092429ddcc2d749b93e4ef6af041d30936"} Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.449034 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" 
event={"ID":"0a363345-ccc6-4328-a283-007e9b83e4c6","Type":"ContainerStarted","Data":"c24b8a073e999a198e0f21335b2ec4410de9dce52146c78d06f5d1b4b5a075dd"} Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.449063 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" event={"ID":"0a363345-ccc6-4328-a283-007e9b83e4c6","Type":"ContainerStarted","Data":"853cfca56b15c6c9fef8e38e0de5f2be99926c49179ccac670f1a337f208f45e"} Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.450476 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.470690 4761 scope.go:117] "RemoveContainer" containerID="ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.477636 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" podStartSLOduration=37.477620116 podStartE2EDuration="37.477620116s" podCreationTimestamp="2025-12-01 10:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:45.475641167 +0000 UTC m=+224.779399781" watchObservedRunningTime="2025-12-01 10:34:45.477620116 +0000 UTC m=+224.781378740" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.490315 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7599fffbbc-m2wfl"] Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.490866 4761 scope.go:117] "RemoveContainer" containerID="ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527" Dec 01 10:34:45 crc kubenswrapper[4761]: E1201 10:34:45.491941 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527\": container with ID starting with ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527 not found: ID does not exist" containerID="ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.492055 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527"} err="failed to get container status \"ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527\": rpc error: code = NotFound desc = could not find container \"ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527\": container with ID starting with ab7ce41d126830d742f9f9fde3cd92e10dc6645b30072bd285ec42efba35f527 not found: ID does not exist" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.495434 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7599fffbbc-m2wfl"] Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.504335 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm"] Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.513391 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7786c744cc-p2gsm"] Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.523471 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-hmgbh" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.976343 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r"] Dec 01 10:34:45 crc kubenswrapper[4761]: E1201 10:34:45.976984 4761 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1aa17ca4-d38b-42ac-9692-674c0f9d7c37" containerName="controller-manager" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.977038 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa17ca4-d38b-42ac-9692-674c0f9d7c37" containerName="controller-manager" Dec 01 10:34:45 crc kubenswrapper[4761]: E1201 10:34:45.977076 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197f7c4e-f0a3-4f54-95ca-d069391728bf" containerName="route-controller-manager" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.977086 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="197f7c4e-f0a3-4f54-95ca-d069391728bf" containerName="route-controller-manager" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.977207 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa17ca4-d38b-42ac-9692-674c0f9d7c37" containerName="controller-manager" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.977223 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="197f7c4e-f0a3-4f54-95ca-d069391728bf" containerName="route-controller-manager" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.977682 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.979600 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.979963 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.979977 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.980531 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.981312 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.987932 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r"] Dec 01 10:34:45 crc kubenswrapper[4761]: I1201 10:34:45.991464 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.076768 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hxp5\" (UniqueName: \"kubernetes.io/projected/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-kube-api-access-6hxp5\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.077029 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-config\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.077104 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-client-ca\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.077170 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-serving-cert\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.178513 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-config\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.178580 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-client-ca\") pod \"route-controller-manager-67bff78b64-cgg5r\" 
(UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.178606 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-serving-cert\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.178661 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hxp5\" (UniqueName: \"kubernetes.io/projected/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-kube-api-access-6hxp5\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.179870 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-client-ca\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.180634 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-config\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.183589 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-serving-cert\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.205719 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxp5\" (UniqueName: \"kubernetes.io/projected/50745ad4-fb7b-4a6f-acdc-f63ff30f6e56-kube-api-access-6hxp5\") pod \"route-controller-manager-67bff78b64-cgg5r\" (UID: \"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56\") " pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.299199 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:46 crc kubenswrapper[4761]: I1201 10:34:46.792311 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r"] Dec 01 10:34:46 crc kubenswrapper[4761]: W1201 10:34:46.801230 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50745ad4_fb7b_4a6f_acdc_f63ff30f6e56.slice/crio-7124e6cc6d2cfe99a4ee4fff678293d65058decca7ac611fbf220d8b379569a6 WatchSource:0}: Error finding container 7124e6cc6d2cfe99a4ee4fff678293d65058decca7ac611fbf220d8b379569a6: Status 404 returned error can't find the container with id 7124e6cc6d2cfe99a4ee4fff678293d65058decca7ac611fbf220d8b379569a6 Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.140158 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="197f7c4e-f0a3-4f54-95ca-d069391728bf" path="/var/lib/kubelet/pods/197f7c4e-f0a3-4f54-95ca-d069391728bf/volumes" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 
10:34:47.140826 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa17ca4-d38b-42ac-9692-674c0f9d7c37" path="/var/lib/kubelet/pods/1aa17ca4-d38b-42ac-9692-674c0f9d7c37/volumes" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.467314 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" event={"ID":"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56","Type":"ContainerStarted","Data":"aa8507ab1c8f902c2aca6efee83f21ff69ed1aced31cd35dfa971b5a67969998"} Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.467373 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" event={"ID":"50745ad4-fb7b-4a6f-acdc-f63ff30f6e56","Type":"ContainerStarted","Data":"7124e6cc6d2cfe99a4ee4fff678293d65058decca7ac611fbf220d8b379569a6"} Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.467629 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.472372 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.485944 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67bff78b64-cgg5r" podStartSLOduration=3.485926165 podStartE2EDuration="3.485926165s" podCreationTimestamp="2025-12-01 10:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:47.483071066 +0000 UTC m=+226.786829690" watchObservedRunningTime="2025-12-01 10:34:47.485926165 +0000 UTC m=+226.789684789" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 
10:34:47.980448 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-958894786-fg5rv"] Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.981581 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.986706 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.987105 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.987306 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.987386 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.987694 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.993101 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.995134 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:34:47 crc kubenswrapper[4761]: I1201 10:34:47.995435 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-958894786-fg5rv"] Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.105842 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-j6qb9\" (UniqueName: \"kubernetes.io/projected/cc44856d-b3d2-4928-bde7-30e851bf0cb1-kube-api-access-j6qb9\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.105914 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-client-ca\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.105938 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-config\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.105957 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-proxy-ca-bundles\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.105984 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc44856d-b3d2-4928-bde7-30e851bf0cb1-serving-cert\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " 
pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.208246 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6qb9\" (UniqueName: \"kubernetes.io/projected/cc44856d-b3d2-4928-bde7-30e851bf0cb1-kube-api-access-j6qb9\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.208347 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-client-ca\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.208395 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-config\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.208451 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-proxy-ca-bundles\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.208508 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc44856d-b3d2-4928-bde7-30e851bf0cb1-serving-cert\") pod 
\"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.209828 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-client-ca\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.210154 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-proxy-ca-bundles\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.210629 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc44856d-b3d2-4928-bde7-30e851bf0cb1-config\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.222134 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc44856d-b3d2-4928-bde7-30e851bf0cb1-serving-cert\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.223286 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6qb9\" (UniqueName: 
\"kubernetes.io/projected/cc44856d-b3d2-4928-bde7-30e851bf0cb1-kube-api-access-j6qb9\") pod \"controller-manager-958894786-fg5rv\" (UID: \"cc44856d-b3d2-4928-bde7-30e851bf0cb1\") " pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.303485 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:48 crc kubenswrapper[4761]: I1201 10:34:48.723060 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-958894786-fg5rv"] Dec 01 10:34:48 crc kubenswrapper[4761]: W1201 10:34:48.733811 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc44856d_b3d2_4928_bde7_30e851bf0cb1.slice/crio-c481dcfbb47142bcb78122788b23a12ce954fb9527a80dfd51296e446109294f WatchSource:0}: Error finding container c481dcfbb47142bcb78122788b23a12ce954fb9527a80dfd51296e446109294f: Status 404 returned error can't find the container with id c481dcfbb47142bcb78122788b23a12ce954fb9527a80dfd51296e446109294f Dec 01 10:34:49 crc kubenswrapper[4761]: I1201 10:34:49.482041 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-958894786-fg5rv" event={"ID":"cc44856d-b3d2-4928-bde7-30e851bf0cb1","Type":"ContainerStarted","Data":"5d48c576c75c861780153f746a8d7d3b4a7140ee988ed2f91a0efd12f4bfe982"} Dec 01 10:34:49 crc kubenswrapper[4761]: I1201 10:34:49.482108 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-958894786-fg5rv" event={"ID":"cc44856d-b3d2-4928-bde7-30e851bf0cb1","Type":"ContainerStarted","Data":"c481dcfbb47142bcb78122788b23a12ce954fb9527a80dfd51296e446109294f"} Dec 01 10:34:49 crc kubenswrapper[4761]: I1201 10:34:49.504281 4761 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager/controller-manager-958894786-fg5rv" podStartSLOduration=5.504263005 podStartE2EDuration="5.504263005s" podCreationTimestamp="2025-12-01 10:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:49.501209909 +0000 UTC m=+228.804968573" watchObservedRunningTime="2025-12-01 10:34:49.504263005 +0000 UTC m=+228.808021639" Dec 01 10:34:50 crc kubenswrapper[4761]: I1201 10:34:50.488378 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:50 crc kubenswrapper[4761]: I1201 10:34:50.493533 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-958894786-fg5rv" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.381428 4761 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.382247 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.382273 4761 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.382801 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592" gracePeriod=15 Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.382855 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69" gracePeriod=15 Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.383049 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d" gracePeriod=15 Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.383007 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735" gracePeriod=15 Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.383194 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655" gracePeriod=15 Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384075 4761 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:34:51 crc kubenswrapper[4761]: E1201 10:34:51.384200 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384213 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:34:51 crc kubenswrapper[4761]: E1201 10:34:51.384221 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384227 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:34:51 crc kubenswrapper[4761]: E1201 10:34:51.384239 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384245 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:34:51 crc kubenswrapper[4761]: E1201 10:34:51.384254 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384261 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 
10:34:51 crc kubenswrapper[4761]: E1201 10:34:51.384269 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384275 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:34:51 crc kubenswrapper[4761]: E1201 10:34:51.384283 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384288 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384395 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384406 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384414 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384424 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.384433 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 10:34:51 crc kubenswrapper[4761]: E1201 10:34:51.385810 4761 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.385844 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.386231 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.426469 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.455483 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.455585 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.455615 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.455647 
4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.455670 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.455722 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.455746 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.455779 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 
10:34:51.556645 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556704 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556746 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556756 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556773 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556813 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556827 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556844 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556847 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556860 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556875 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556915 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.556972 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.557130 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.557173 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.557181 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:51 crc kubenswrapper[4761]: I1201 10:34:51.720840 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:34:52 crc kubenswrapper[4761]: I1201 10:34:52.521275 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3113cd3cceb9d8f2523d4fec5c8807e7ac7453096ee20b0dc0500e98c6d8d4c8"} Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.528402 4761 generic.go:334] "Generic (PLEG): container finished" podID="57880e23-a871-459f-8c11-2d59e61e2eaf" containerID="df4ce2c3ffadc7b6df474b8eac51389ad05aeb25f7eb9b5d2e6161c2a93319ff" exitCode=0 Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.528522 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"57880e23-a871-459f-8c11-2d59e61e2eaf","Type":"ContainerDied","Data":"df4ce2c3ffadc7b6df474b8eac51389ad05aeb25f7eb9b5d2e6161c2a93319ff"} Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.529720 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.530308 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.531034 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25"} Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.532268 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.533013 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.533744 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.535361 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.536451 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592" exitCode=0 Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.536493 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735" exitCode=0 Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.536504 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655" exitCode=0 Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.536516 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d" exitCode=2 Dec 01 10:34:53 crc kubenswrapper[4761]: I1201 10:34:53.536607 4761 scope.go:117] "RemoveContainer" containerID="6dc500373aa3722b8b4bcfa76840976b6580df39c6759d6800ad259292be9cec" Dec 01 10:34:54 crc kubenswrapper[4761]: E1201 10:34:54.272913 4761 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: E1201 10:34:54.273528 4761 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: E1201 10:34:54.274057 4761 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: E1201 10:34:54.274435 4761 controller.go:195] 
"Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: E1201 10:34:54.274882 4761 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.274920 4761 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 10:34:54 crc kubenswrapper[4761]: E1201 10:34:54.275275 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="200ms" Dec 01 10:34:54 crc kubenswrapper[4761]: E1201 10:34:54.477011 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="400ms" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.546149 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.871314 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.872625 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.873898 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.874403 4761 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.874757 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: E1201 10:34:54.878191 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="800ms" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.907652 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.907688 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.907786 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.907819 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.907837 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.907951 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.908079 4761 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.908093 4761 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.908101 4761 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.961899 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.962655 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.963120 4761 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:54 crc kubenswrapper[4761]: I1201 10:34:54.963373 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.009474 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-var-lock\") pod \"57880e23-a871-459f-8c11-2d59e61e2eaf\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.009528 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-kubelet-dir\") pod \"57880e23-a871-459f-8c11-2d59e61e2eaf\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.009707 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57880e23-a871-459f-8c11-2d59e61e2eaf-kube-api-access\") pod \"57880e23-a871-459f-8c11-2d59e61e2eaf\" (UID: \"57880e23-a871-459f-8c11-2d59e61e2eaf\") " Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.009888 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "57880e23-a871-459f-8c11-2d59e61e2eaf" (UID: "57880e23-a871-459f-8c11-2d59e61e2eaf"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.009886 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-var-lock" (OuterVolumeSpecName: "var-lock") pod "57880e23-a871-459f-8c11-2d59e61e2eaf" (UID: "57880e23-a871-459f-8c11-2d59e61e2eaf"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.010020 4761 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.015949 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57880e23-a871-459f-8c11-2d59e61e2eaf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "57880e23-a871-459f-8c11-2d59e61e2eaf" (UID: "57880e23-a871-459f-8c11-2d59e61e2eaf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.111997 4761 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57880e23-a871-459f-8c11-2d59e61e2eaf-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.112065 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57880e23-a871-459f-8c11-2d59e61e2eaf-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.138948 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.556843 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"57880e23-a871-459f-8c11-2d59e61e2eaf","Type":"ContainerDied","Data":"5efc0e400b0dcc59e524d11fe1054e6ec524bbb29eedef43ab7818265f4f02d9"} Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.556896 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5efc0e400b0dcc59e524d11fe1054e6ec524bbb29eedef43ab7818265f4f02d9" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.556866 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.562109 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.563154 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.565946 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.567188 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69" exitCode=0 Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.567271 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.567770 4761 scope.go:117] "RemoveContainer" containerID="ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.568725 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.569125 4761 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.571632 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.575388 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.575857 4761 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.576597 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.590642 4761 scope.go:117] "RemoveContainer" containerID="25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.609837 4761 scope.go:117] "RemoveContainer" containerID="8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.630879 4761 scope.go:117] "RemoveContainer" containerID="275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.650060 4761 scope.go:117] "RemoveContainer" containerID="96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.666045 4761 scope.go:117] "RemoveContainer" containerID="80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea" Dec 01 10:34:55 crc kubenswrapper[4761]: E1201 10:34:55.678694 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="1.6s" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.685177 4761 scope.go:117] "RemoveContainer" 
containerID="ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592" Dec 01 10:34:55 crc kubenswrapper[4761]: E1201 10:34:55.685837 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\": container with ID starting with ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592 not found: ID does not exist" containerID="ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.685914 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592"} err="failed to get container status \"ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\": rpc error: code = NotFound desc = could not find container \"ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592\": container with ID starting with ba6d475230082e42185565224e3c24a79c9073e6ee102e3ddab8544ab0c1f592 not found: ID does not exist" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.685948 4761 scope.go:117] "RemoveContainer" containerID="25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735" Dec 01 10:34:55 crc kubenswrapper[4761]: E1201 10:34:55.686981 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\": container with ID starting with 25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735 not found: ID does not exist" containerID="25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.687002 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735"} err="failed to get container status \"25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\": rpc error: code = NotFound desc = could not find container \"25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735\": container with ID starting with 25cffe0ce4480bfb9bbebc2cff8f174a0d20d1e37510fd4069b3512fe8552735 not found: ID does not exist" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.687015 4761 scope.go:117] "RemoveContainer" containerID="8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655" Dec 01 10:34:55 crc kubenswrapper[4761]: E1201 10:34:55.687712 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\": container with ID starting with 8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655 not found: ID does not exist" containerID="8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.687775 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655"} err="failed to get container status \"8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\": rpc error: code = NotFound desc = could not find container \"8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655\": container with ID starting with 8f12ee6cc04920e1bb30cd6348345bfb96044ad123b915dede69580466a7c655 not found: ID does not exist" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.687810 4761 scope.go:117] "RemoveContainer" containerID="275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d" Dec 01 10:34:55 crc kubenswrapper[4761]: E1201 10:34:55.689067 4761 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\": container with ID starting with 275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d not found: ID does not exist" containerID="275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.689113 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d"} err="failed to get container status \"275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\": rpc error: code = NotFound desc = could not find container \"275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d\": container with ID starting with 275da5ed0c6928942edb424df37779f69395ec8b8d7339fde1fd3caf7fda218d not found: ID does not exist" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.689160 4761 scope.go:117] "RemoveContainer" containerID="96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69" Dec 01 10:34:55 crc kubenswrapper[4761]: E1201 10:34:55.689833 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\": container with ID starting with 96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69 not found: ID does not exist" containerID="96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.689914 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69"} err="failed to get container status \"96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\": rpc error: code = NotFound desc = could not find container 
\"96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69\": container with ID starting with 96815f42b581ab656a3dcb3f89aa75224ed8936d9fda94196b28fda12e6f2a69 not found: ID does not exist" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.689950 4761 scope.go:117] "RemoveContainer" containerID="80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea" Dec 01 10:34:55 crc kubenswrapper[4761]: E1201 10:34:55.690390 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\": container with ID starting with 80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea not found: ID does not exist" containerID="80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea" Dec 01 10:34:55 crc kubenswrapper[4761]: I1201 10:34:55.690423 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea"} err="failed to get container status \"80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\": rpc error: code = NotFound desc = could not find container \"80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea\": container with ID starting with 80008e72f4891d8589781d74860efc9c6ccd20bfc9ce850df6658bdfaf1011ea not found: ID does not exist" Dec 01 10:34:57 crc kubenswrapper[4761]: E1201 10:34:57.280162 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="3.2s" Dec 01 10:34:57 crc kubenswrapper[4761]: E1201 10:34:57.530074 4761 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial 
tcp 38.129.56.88:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d10f970f7477c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:34:52.527257468 +0000 UTC m=+231.831016092,LastTimestamp:2025-12-01 10:34:52.527257468 +0000 UTC m=+231.831016092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:34:58 crc kubenswrapper[4761]: E1201 10:34:58.373845 4761 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.88:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d10f970f7477c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:34:52.527257468 +0000 UTC m=+231.831016092,LastTimestamp:2025-12-01 10:34:52.527257468 +0000 UTC m=+231.831016092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:34:59 crc kubenswrapper[4761]: E1201 10:34:59.596601 4761 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:34:59Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:34:59Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:34:59Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:34:59Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:59 crc kubenswrapper[4761]: E1201 10:34:59.597751 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:59 crc kubenswrapper[4761]: E1201 10:34:59.598321 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:59 crc kubenswrapper[4761]: E1201 10:34:59.598771 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 
38.129.56.88:6443: connect: connection refused" Dec 01 10:34:59 crc kubenswrapper[4761]: E1201 10:34:59.599177 4761 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:34:59 crc kubenswrapper[4761]: E1201 10:34:59.599202 4761 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:35:00 crc kubenswrapper[4761]: E1201 10:35:00.481813 4761 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.88:6443: connect: connection refused" interval="6.4s" Dec 01 10:35:01 crc kubenswrapper[4761]: I1201 10:35:01.134711 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:35:01 crc kubenswrapper[4761]: I1201 10:35:01.135484 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.128688 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.131090 4761 status_manager.go:851] "Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.131637 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.144871 4761 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.144927 4761 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:02 crc kubenswrapper[4761]: E1201 10:35:02.145914 4761 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.146446 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:02 crc kubenswrapper[4761]: W1201 10:35:02.167254 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-7d2daaf19e62bcad1a0792515bd568ee9f814b7b342671412d157c05b9423c96 WatchSource:0}: Error finding container 7d2daaf19e62bcad1a0792515bd568ee9f814b7b342671412d157c05b9423c96: Status 404 returned error can't find the container with id 7d2daaf19e62bcad1a0792515bd568ee9f814b7b342671412d157c05b9423c96 Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.615854 4761 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="46e7c83afa0d416c79e57f3b44b63be90c9725c9588fd01e0095fdb9ac668299" exitCode=0 Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.616039 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"46e7c83afa0d416c79e57f3b44b63be90c9725c9588fd01e0095fdb9ac668299"} Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.616826 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7d2daaf19e62bcad1a0792515bd568ee9f814b7b342671412d157c05b9423c96"} Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.617503 4761 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.617621 4761 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.617912 4761 status_manager.go:851] 
"Failed to get status for pod" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:35:02 crc kubenswrapper[4761]: E1201 10:35:02.618163 4761 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:02 crc kubenswrapper[4761]: I1201 10:35:02.618392 4761 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.88:6443: connect: connection refused" Dec 01 10:35:03 crc kubenswrapper[4761]: I1201 10:35:03.628355 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"afa30df8805d804b8ec65f07b95d1a3a4abe7971a3c20abca10b964bc9f4f656"} Dec 01 10:35:03 crc kubenswrapper[4761]: I1201 10:35:03.629989 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"217e9a0ff448b0a9336a45ea0f4ce2679b9f6171eaf5fe84488bf0a9542c18ff"} Dec 01 10:35:03 crc kubenswrapper[4761]: I1201 10:35:03.630117 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d0bf5ab609d8d114eb9e2e407eed9cd656c5871aeadcbb0630bbebd0bf6397a9"} Dec 01 10:35:04 crc kubenswrapper[4761]: I1201 10:35:04.635586 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2608bcebf84a24507fc805c5e99cc629cb62794902d0d7ebc0ac8a61c6c6c247"} Dec 01 10:35:04 crc kubenswrapper[4761]: I1201 10:35:04.635893 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:04 crc kubenswrapper[4761]: I1201 10:35:04.635907 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b88ea3e00d12938ed8b5986028fded8dafabd17146e61f4e65cd054af3829fd3"} Dec 01 10:35:04 crc kubenswrapper[4761]: I1201 10:35:04.635834 4761 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:04 crc kubenswrapper[4761]: I1201 10:35:04.635926 4761 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:06 crc kubenswrapper[4761]: I1201 10:35:06.649783 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 10:35:06 crc kubenswrapper[4761]: I1201 10:35:06.650114 4761 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398" exitCode=1 Dec 01 10:35:06 crc kubenswrapper[4761]: I1201 10:35:06.650146 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398"} Dec 01 10:35:06 crc kubenswrapper[4761]: I1201 10:35:06.650691 4761 scope.go:117] "RemoveContainer" containerID="66524b14e523f3956c66e80d89044c5f0383bd10126f275170c13bb64a9a7398" Dec 01 10:35:07 crc kubenswrapper[4761]: I1201 10:35:07.147223 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:07 crc kubenswrapper[4761]: I1201 10:35:07.147272 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:07 crc kubenswrapper[4761]: I1201 10:35:07.155670 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:07 crc kubenswrapper[4761]: I1201 10:35:07.664282 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 10:35:07 crc kubenswrapper[4761]: I1201 10:35:07.664731 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ddad3f216b7a37538df9d351315f603e39132a9cb89cb351af868e235a424f1e"} Dec 01 10:35:08 crc kubenswrapper[4761]: I1201 10:35:08.035830 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:35:08 crc kubenswrapper[4761]: I1201 10:35:08.041636 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:35:08 crc kubenswrapper[4761]: I1201 
10:35:08.673153 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:35:09 crc kubenswrapper[4761]: I1201 10:35:09.646886 4761 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:09 crc kubenswrapper[4761]: I1201 10:35:09.677568 4761 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:09 crc kubenswrapper[4761]: I1201 10:35:09.677601 4761 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:09 crc kubenswrapper[4761]: I1201 10:35:09.681495 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:10 crc kubenswrapper[4761]: I1201 10:35:10.732887 4761 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:10 crc kubenswrapper[4761]: I1201 10:35:10.733283 4761 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="401f34d6-1db1-49fc-b016-73a397bcd9d9" Dec 01 10:35:11 crc kubenswrapper[4761]: I1201 10:35:11.153207 4761 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1f6a1066-ccd0-4797-9938-317b614b6b8f" Dec 01 10:35:15 crc kubenswrapper[4761]: I1201 10:35:15.691062 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 10:35:15 crc kubenswrapper[4761]: I1201 10:35:15.887057 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 10:35:15 crc kubenswrapper[4761]: I1201 10:35:15.934284 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 10:35:15 crc kubenswrapper[4761]: I1201 10:35:15.953353 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.008628 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.141512 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.161505 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.345991 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.402803 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.442131 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.518150 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.625907 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 10:35:16 crc 
kubenswrapper[4761]: I1201 10:35:16.656410 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.697100 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.762056 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.829410 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.837371 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.914454 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.937293 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.978623 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 10:35:16 crc kubenswrapper[4761]: I1201 10:35:16.993148 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.027112 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.063535 4761 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.141388 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.154177 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.162347 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.363048 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.465660 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.495654 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.541380 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.542203 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.583265 4761 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.679804 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 10:35:17 crc kubenswrapper[4761]: 
I1201 10:35:17.713991 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.819609 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.878725 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.889914 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.903209 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 10:35:17 crc kubenswrapper[4761]: I1201 10:35:17.995538 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.049412 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.177272 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.246044 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.393569 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.467104 4761 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.536960 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.560268 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.621864 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.708310 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.759024 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.765883 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 10:35:18 crc kubenswrapper[4761]: I1201 10:35:18.884906 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.144843 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.171699 4761 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.186470 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.190415 4761 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.197511 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.275784 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.301603 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.370115 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.398511 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.414197 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.528879 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.552060 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.559534 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.630996 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.641300 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.712971 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.842830 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.904174 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 10:35:19 crc kubenswrapper[4761]: I1201 10:35:19.988858 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.061941 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.061963 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.096600 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.107705 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.116216 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.130466 4761 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.135945 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.166432 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.169113 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.214125 4761 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.216220 4761 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.274757 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.333395 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.487335 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.549490 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.657620 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.689909 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.746516 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.762267 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.786245 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.838663 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.886811 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.939053 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.953953 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.985631 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 10:35:20 crc kubenswrapper[4761]: I1201 10:35:20.999351 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 10:35:21 crc kubenswrapper[4761]: 
I1201 10:35:21.045334 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.056178 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.109239 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.141317 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.215300 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.247343 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.257679 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.299352 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.369193 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.379452 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.472540 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.516528 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.520446 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.523974 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.656369 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.693532 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.727733 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.763953 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.783796 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.809229 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.820953 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.828332 4761 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.837313 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.908171 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.909801 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.944886 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 10:35:21 crc kubenswrapper[4761]: I1201 10:35:21.980645 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 10:35:22 crc kubenswrapper[4761]: I1201 10:35:22.009094 4761 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 10:35:22 crc kubenswrapper[4761]: I1201 10:35:22.299896 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 10:35:22 crc kubenswrapper[4761]: I1201 10:35:22.605431 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 10:35:23 crc kubenswrapper[4761]: I1201 10:35:23.188887 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 10:35:23 crc kubenswrapper[4761]: I1201 10:35:23.388274 4761 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 10:35:23 crc kubenswrapper[4761]: I1201 10:35:23.586692 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 10:35:23 crc kubenswrapper[4761]: I1201 10:35:23.751925 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 10:35:24 crc kubenswrapper[4761]: I1201 10:35:24.447044 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 10:35:24 crc kubenswrapper[4761]: I1201 10:35:24.531695 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 10:35:24 crc kubenswrapper[4761]: I1201 10:35:24.536489 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 10:35:24 crc kubenswrapper[4761]: I1201 10:35:24.860903 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 10:35:24 crc kubenswrapper[4761]: I1201 10:35:24.901518 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 10:35:25 crc kubenswrapper[4761]: I1201 10:35:25.364130 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 10:35:25 crc kubenswrapper[4761]: I1201 10:35:25.636604 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 10:35:25 crc kubenswrapper[4761]: I1201 10:35:25.708157 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 10:35:25 crc kubenswrapper[4761]: I1201 10:35:25.858426 4761 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 10:35:25 crc kubenswrapper[4761]: I1201 10:35:25.926993 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.062131 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.064969 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.153099 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.175218 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.339264 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.410607 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.456897 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.574510 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.649625 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 10:35:26 crc 
kubenswrapper[4761]: I1201 10:35:26.786519 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 10:35:26 crc kubenswrapper[4761]: I1201 10:35:26.910861 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 10:35:27 crc kubenswrapper[4761]: I1201 10:35:27.002018 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 10:35:27 crc kubenswrapper[4761]: I1201 10:35:27.243698 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:35:27 crc kubenswrapper[4761]: I1201 10:35:27.324630 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 10:35:27 crc kubenswrapper[4761]: I1201 10:35:27.329651 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 10:35:27 crc kubenswrapper[4761]: I1201 10:35:27.454689 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 10:35:27 crc kubenswrapper[4761]: I1201 10:35:27.777005 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 10:35:27 crc kubenswrapper[4761]: I1201 10:35:27.810115 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 10:35:27 crc kubenswrapper[4761]: I1201 10:35:27.923165 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 10:35:28 crc kubenswrapper[4761]: I1201 10:35:28.226494 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 10:35:28 crc kubenswrapper[4761]: I1201 10:35:28.228205 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 10:35:28 crc kubenswrapper[4761]: I1201 10:35:28.390677 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 10:35:28 crc kubenswrapper[4761]: I1201 10:35:28.716709 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 10:35:28 crc kubenswrapper[4761]: I1201 10:35:28.862915 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 10:35:28 crc kubenswrapper[4761]: I1201 10:35:28.931351 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 10:35:29.210987 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 10:35:29.265000 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 10:35:29.286539 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 10:35:29.306192 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 10:35:29.574256 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 
10:35:29.763207 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 10:35:29.770777 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 10:35:29.811939 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 10:35:29 crc kubenswrapper[4761]: I1201 10:35:29.836591 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.046369 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.143792 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.153964 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.157503 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.221396 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.283511 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.335235 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" 
Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.393498 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.521717 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.564602 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.663116 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.731307 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.745745 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.800939 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.887827 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 10:35:30 crc kubenswrapper[4761]: I1201 10:35:30.969188 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 10:35:31 crc kubenswrapper[4761]: I1201 10:35:31.341972 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 10:35:31 crc kubenswrapper[4761]: I1201 10:35:31.388449 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert" Dec 01 10:35:31 crc kubenswrapper[4761]: I1201 10:35:31.421128 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:35:31 crc kubenswrapper[4761]: I1201 10:35:31.663514 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 10:35:31 crc kubenswrapper[4761]: I1201 10:35:31.811920 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.087079 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.100795 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.180198 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.230761 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.292120 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.405295 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.566736 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.578683 4761 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.589422 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.740971 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.757302 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 10:35:32 crc kubenswrapper[4761]: I1201 10:35:32.887330 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 10:35:33 crc kubenswrapper[4761]: I1201 10:35:33.239766 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 10:35:33 crc kubenswrapper[4761]: I1201 10:35:33.400428 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 10:35:33 crc kubenswrapper[4761]: I1201 10:35:33.411750 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:35:33 crc kubenswrapper[4761]: I1201 10:35:33.426695 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 10:35:33 crc kubenswrapper[4761]: I1201 10:35:33.455111 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 10:35:33 crc kubenswrapper[4761]: I1201 10:35:33.636980 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 10:35:33 crc kubenswrapper[4761]: I1201 10:35:33.922995 4761 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 10:35:33 crc kubenswrapper[4761]: I1201 10:35:33.937787 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.041700 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.093663 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.126001 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.267077 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.376833 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.600682 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.664718 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.698346 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.711680 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 10:35:34 crc 
kubenswrapper[4761]: I1201 10:35:34.757676 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.836661 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.901156 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.925006 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.930983 4761 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.931401 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=43.931378567 podStartE2EDuration="43.931378567s" podCreationTimestamp="2025-12-01 10:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:35:09.676627979 +0000 UTC m=+248.980386603" watchObservedRunningTime="2025-12-01 10:35:34.931378567 +0000 UTC m=+274.235137191" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.936870 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.936946 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.941085 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:34 crc kubenswrapper[4761]: I1201 10:35:34.956769 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.956745821 podStartE2EDuration="25.956745821s" podCreationTimestamp="2025-12-01 10:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:35:34.954257146 +0000 UTC m=+274.258015770" watchObservedRunningTime="2025-12-01 10:35:34.956745821 +0000 UTC m=+274.260504445" Dec 01 10:35:35 crc kubenswrapper[4761]: I1201 10:35:35.089710 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 10:35:35 crc kubenswrapper[4761]: I1201 10:35:35.164967 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 10:35:35 crc kubenswrapper[4761]: I1201 10:35:35.233640 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 10:35:35 crc kubenswrapper[4761]: I1201 10:35:35.396674 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 10:35:35 crc kubenswrapper[4761]: I1201 10:35:35.421052 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 10:35:35 crc kubenswrapper[4761]: I1201 10:35:35.454493 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 10:35:35 crc kubenswrapper[4761]: I1201 10:35:35.461756 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 10:35:35 crc kubenswrapper[4761]: I1201 10:35:35.799588 
4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 10:35:36 crc kubenswrapper[4761]: I1201 10:35:36.156041 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 10:35:36 crc kubenswrapper[4761]: I1201 10:35:36.408477 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 10:35:36 crc kubenswrapper[4761]: I1201 10:35:36.442285 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 10:35:36 crc kubenswrapper[4761]: I1201 10:35:36.599599 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 10:35:36 crc kubenswrapper[4761]: I1201 10:35:36.740205 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 10:35:37 crc kubenswrapper[4761]: I1201 10:35:37.115339 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 10:35:37 crc kubenswrapper[4761]: I1201 10:35:37.673012 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 10:35:37 crc kubenswrapper[4761]: I1201 10:35:37.812874 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 10:35:37 crc kubenswrapper[4761]: I1201 10:35:37.829240 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 10:35:37 crc kubenswrapper[4761]: I1201 10:35:37.953948 4761 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 10:35:38 crc kubenswrapper[4761]: I1201 10:35:38.011455 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 10:35:38 crc kubenswrapper[4761]: I1201 10:35:38.673331 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 10:35:39 crc kubenswrapper[4761]: I1201 10:35:39.118999 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 10:35:39 crc kubenswrapper[4761]: I1201 10:35:39.120313 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 10:35:39 crc kubenswrapper[4761]: I1201 10:35:39.328750 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 10:35:40 crc kubenswrapper[4761]: I1201 10:35:40.405212 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 10:35:43 crc kubenswrapper[4761]: I1201 10:35:43.945192 4761 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:35:43 crc kubenswrapper[4761]: I1201 10:35:43.945414 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25" gracePeriod=5 Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.519801 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 
10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.522667 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.688888 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.689046 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.689679 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.689786 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.689718 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.689862 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.689933 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.690053 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.690236 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.690779 4761 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.690813 4761 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.690830 4761 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.690843 4761 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.701659 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.791827 4761 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.996427 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.996485 4761 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25" exitCode=137 Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.996529 4761 scope.go:117] "RemoveContainer" containerID="60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25" Dec 01 10:35:49 crc kubenswrapper[4761]: I1201 10:35:49.996691 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:50 crc kubenswrapper[4761]: I1201 10:35:50.023781 4761 scope.go:117] "RemoveContainer" containerID="60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25" Dec 01 10:35:50 crc kubenswrapper[4761]: E1201 10:35:50.024308 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25\": container with ID starting with 60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25 not found: ID does not exist" containerID="60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25" Dec 01 10:35:50 crc kubenswrapper[4761]: I1201 10:35:50.024388 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25"} err="failed to get container status \"60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25\": rpc error: code = NotFound desc = could not find container \"60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25\": container with ID starting with 60c3cf5f2b6424ebd6094e2f8fc214e77287236161372e9ace7b33fec29bae25 not found: ID does not exist" Dec 01 10:35:51 crc kubenswrapper[4761]: I1201 10:35:51.138227 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 01 10:35:51 crc kubenswrapper[4761]: I1201 10:35:51.138895 4761 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 01 10:35:51 crc kubenswrapper[4761]: I1201 10:35:51.150582 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:35:51 crc kubenswrapper[4761]: I1201 
10:35:51.150614 4761 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4bf6e326-bf4b-488a-9afa-4170aa687cc0" Dec 01 10:35:51 crc kubenswrapper[4761]: I1201 10:35:51.155296 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:35:51 crc kubenswrapper[4761]: I1201 10:35:51.155354 4761 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4bf6e326-bf4b-488a-9afa-4170aa687cc0" Dec 01 10:35:59 crc kubenswrapper[4761]: I1201 10:35:59.065348 4761 generic.go:334] "Generic (PLEG): container finished" podID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerID="0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e" exitCode=0 Dec 01 10:35:59 crc kubenswrapper[4761]: I1201 10:35:59.065454 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" event={"ID":"7886c492-0b69-4cb1-aef7-08e7e482bc6a","Type":"ContainerDied","Data":"0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e"} Dec 01 10:35:59 crc kubenswrapper[4761]: I1201 10:35:59.066513 4761 scope.go:117] "RemoveContainer" containerID="0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e" Dec 01 10:36:00 crc kubenswrapper[4761]: I1201 10:36:00.077540 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" event={"ID":"7886c492-0b69-4cb1-aef7-08e7e482bc6a","Type":"ContainerStarted","Data":"187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312"} Dec 01 10:36:00 crc kubenswrapper[4761]: I1201 10:36:00.077943 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:36:00 crc kubenswrapper[4761]: I1201 
10:36:00.079739 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.057839 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7m7cn"] Dec 01 10:36:23 crc kubenswrapper[4761]: E1201 10:36:23.058632 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.058646 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 10:36:23 crc kubenswrapper[4761]: E1201 10:36:23.058660 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" containerName="installer" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.058669 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" containerName="installer" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.058784 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="57880e23-a871-459f-8c11-2d59e61e2eaf" containerName="installer" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.058798 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.059199 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.080302 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7m7cn"] Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.241446 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.241502 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-registry-certificates\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.241525 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-trusted-ca\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.241561 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.241593 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.241611 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6snjb\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-kube-api-access-6snjb\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.241626 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-bound-sa-token\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.241727 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-registry-tls\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.272174 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.342429 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.342526 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-registry-certificates\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.342579 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-trusted-ca\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.342602 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.342630 
4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6snjb\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-kube-api-access-6snjb\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.342649 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-bound-sa-token\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.342672 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-registry-tls\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.343288 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.344217 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-trusted-ca\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 
crc kubenswrapper[4761]: I1201 10:36:23.345201 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-registry-certificates\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.348072 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.348464 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-registry-tls\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.361709 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-bound-sa-token\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.375092 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6snjb\" (UniqueName: \"kubernetes.io/projected/aa3fdffb-be78-4e2d-9813-a78e36b3e2ae-kube-api-access-6snjb\") pod \"image-registry-66df7c8f76-7m7cn\" (UID: \"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:23 crc kubenswrapper[4761]: I1201 10:36:23.673946 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:24 crc kubenswrapper[4761]: I1201 10:36:24.146653 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7m7cn"] Dec 01 10:36:24 crc kubenswrapper[4761]: I1201 10:36:24.274880 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" event={"ID":"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae","Type":"ContainerStarted","Data":"a713369ae7f3c78792585f9270abbb9e09607899d755132e8bdb29ed6686620d"} Dec 01 10:36:25 crc kubenswrapper[4761]: I1201 10:36:25.282173 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" event={"ID":"aa3fdffb-be78-4e2d-9813-a78e36b3e2ae","Type":"ContainerStarted","Data":"f5af5bd97dcc03ca37264b0963cbd7b171519b432238ac6736c9858746bd0c6a"} Dec 01 10:36:25 crc kubenswrapper[4761]: I1201 10:36:25.282591 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:25 crc kubenswrapper[4761]: I1201 10:36:25.311306 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" podStartSLOduration=2.311267689 podStartE2EDuration="2.311267689s" podCreationTimestamp="2025-12-01 10:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:36:25.307670925 +0000 UTC m=+324.611429569" watchObservedRunningTime="2025-12-01 10:36:25.311267689 +0000 UTC m=+324.615026353" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.242084 4761 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-nztd7"] Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.242752 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nztd7" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerName="registry-server" containerID="cri-o://bc6e4a41ee9fa99f3cb30dbae761dbe61705bc0d840abe2adc038c1ed1fb799c" gracePeriod=30 Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.259455 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d99sk"] Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.259790 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d99sk" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerName="registry-server" containerID="cri-o://c37465957bcbbab7f933eeb3b4f186c26f6625eb3b4e196ebf0a05d05579ca70" gracePeriod=30 Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.269253 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2xx98"] Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.269584 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" containerID="cri-o://187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312" gracePeriod=30 Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.281646 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvkf2"] Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.281965 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fvkf2" podUID="0e51c452-5010-4af5-bb69-941565926337" containerName="registry-server" 
containerID="cri-o://01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984" gracePeriod=30 Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.287641 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lgbgw"] Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.294765 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.296765 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-55clp"] Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.296982 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-55clp" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerName="registry-server" containerID="cri-o://c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd" gracePeriod=30 Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.311039 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lgbgw"] Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.382609 4761 generic.go:334] "Generic (PLEG): container finished" podID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerID="bc6e4a41ee9fa99f3cb30dbae761dbe61705bc0d840abe2adc038c1ed1fb799c" exitCode=0 Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.382678 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nztd7" event={"ID":"3fe88ace-f487-4b05-a9de-d5bdd2945c75","Type":"ContainerDied","Data":"bc6e4a41ee9fa99f3cb30dbae761dbe61705bc0d840abe2adc038c1ed1fb799c"} Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.384813 4761 generic.go:334] "Generic (PLEG): container finished" podID="1e69dab2-4c11-4352-95c8-92499a4c5a75" 
containerID="c37465957bcbbab7f933eeb3b4f186c26f6625eb3b4e196ebf0a05d05579ca70" exitCode=0 Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.384860 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d99sk" event={"ID":"1e69dab2-4c11-4352-95c8-92499a4c5a75","Type":"ContainerDied","Data":"c37465957bcbbab7f933eeb3b4f186c26f6625eb3b4e196ebf0a05d05579ca70"} Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.479918 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/04df1b9e-01cf-41e0-af31-dcb2e0512d45-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.480071 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04df1b9e-01cf-41e0-af31-dcb2e0512d45-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.480140 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nst74\" (UniqueName: \"kubernetes.io/projected/04df1b9e-01cf-41e0-af31-dcb2e0512d45-kube-api-access-nst74\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.581354 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nst74\" (UniqueName: 
\"kubernetes.io/projected/04df1b9e-01cf-41e0-af31-dcb2e0512d45-kube-api-access-nst74\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.581409 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/04df1b9e-01cf-41e0-af31-dcb2e0512d45-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.581454 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04df1b9e-01cf-41e0-af31-dcb2e0512d45-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.582719 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04df1b9e-01cf-41e0-af31-dcb2e0512d45-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.589721 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/04df1b9e-01cf-41e0-af31-dcb2e0512d45-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 
crc kubenswrapper[4761]: I1201 10:36:42.598120 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nst74\" (UniqueName: \"kubernetes.io/projected/04df1b9e-01cf-41e0-af31-dcb2e0512d45-kube-api-access-nst74\") pod \"marketplace-operator-79b997595-lgbgw\" (UID: \"04df1b9e-01cf-41e0-af31-dcb2e0512d45\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.611184 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.719814 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.784267 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-utilities\") pod \"6500351a-78de-4cb9-bc74-12a450bbc76e\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.784318 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldm8d\" (UniqueName: \"kubernetes.io/projected/6500351a-78de-4cb9-bc74-12a450bbc76e-kube-api-access-ldm8d\") pod \"6500351a-78de-4cb9-bc74-12a450bbc76e\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.784369 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-catalog-content\") pod \"6500351a-78de-4cb9-bc74-12a450bbc76e\" (UID: \"6500351a-78de-4cb9-bc74-12a450bbc76e\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.785531 4761 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-utilities" (OuterVolumeSpecName: "utilities") pod "6500351a-78de-4cb9-bc74-12a450bbc76e" (UID: "6500351a-78de-4cb9-bc74-12a450bbc76e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.792880 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6500351a-78de-4cb9-bc74-12a450bbc76e-kube-api-access-ldm8d" (OuterVolumeSpecName: "kube-api-access-ldm8d") pod "6500351a-78de-4cb9-bc74-12a450bbc76e" (UID: "6500351a-78de-4cb9-bc74-12a450bbc76e"). InnerVolumeSpecName "kube-api-access-ldm8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.858970 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.871706 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.878948 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885186 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnctx\" (UniqueName: \"kubernetes.io/projected/7886c492-0b69-4cb1-aef7-08e7e482bc6a-kube-api-access-hnctx\") pod \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885260 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-utilities\") pod \"0e51c452-5010-4af5-bb69-941565926337\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885285 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-operator-metrics\") pod \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885307 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42248\" (UniqueName: \"kubernetes.io/projected/1e69dab2-4c11-4352-95c8-92499a4c5a75-kube-api-access-42248\") pod \"1e69dab2-4c11-4352-95c8-92499a4c5a75\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885328 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqdsj\" (UniqueName: \"kubernetes.io/projected/0e51c452-5010-4af5-bb69-941565926337-kube-api-access-vqdsj\") pod \"0e51c452-5010-4af5-bb69-941565926337\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885353 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-catalog-content\") pod \"0e51c452-5010-4af5-bb69-941565926337\" (UID: \"0e51c452-5010-4af5-bb69-941565926337\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885378 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-trusted-ca\") pod \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\" (UID: \"7886c492-0b69-4cb1-aef7-08e7e482bc6a\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885410 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-catalog-content\") pod \"1e69dab2-4c11-4352-95c8-92499a4c5a75\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885425 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-utilities\") pod \"1e69dab2-4c11-4352-95c8-92499a4c5a75\" (UID: \"1e69dab2-4c11-4352-95c8-92499a4c5a75\") " Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885610 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.885622 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldm8d\" (UniqueName: \"kubernetes.io/projected/6500351a-78de-4cb9-bc74-12a450bbc76e-kube-api-access-ldm8d\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.887085 4761 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-utilities" (OuterVolumeSpecName: "utilities") pod "0e51c452-5010-4af5-bb69-941565926337" (UID: "0e51c452-5010-4af5-bb69-941565926337"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.887947 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7886c492-0b69-4cb1-aef7-08e7e482bc6a" (UID: "7886c492-0b69-4cb1-aef7-08e7e482bc6a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.888747 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-utilities" (OuterVolumeSpecName: "utilities") pod "1e69dab2-4c11-4352-95c8-92499a4c5a75" (UID: "1e69dab2-4c11-4352-95c8-92499a4c5a75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.889425 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e69dab2-4c11-4352-95c8-92499a4c5a75-kube-api-access-42248" (OuterVolumeSpecName: "kube-api-access-42248") pod "1e69dab2-4c11-4352-95c8-92499a4c5a75" (UID: "1e69dab2-4c11-4352-95c8-92499a4c5a75"). InnerVolumeSpecName "kube-api-access-42248". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.891359 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e51c452-5010-4af5-bb69-941565926337-kube-api-access-vqdsj" (OuterVolumeSpecName: "kube-api-access-vqdsj") pod "0e51c452-5010-4af5-bb69-941565926337" (UID: "0e51c452-5010-4af5-bb69-941565926337"). InnerVolumeSpecName "kube-api-access-vqdsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.892961 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.895064 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7886c492-0b69-4cb1-aef7-08e7e482bc6a" (UID: "7886c492-0b69-4cb1-aef7-08e7e482bc6a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.895445 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7886c492-0b69-4cb1-aef7-08e7e482bc6a-kube-api-access-hnctx" (OuterVolumeSpecName: "kube-api-access-hnctx") pod "7886c492-0b69-4cb1-aef7-08e7e482bc6a" (UID: "7886c492-0b69-4cb1-aef7-08e7e482bc6a"). InnerVolumeSpecName "kube-api-access-hnctx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.930997 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e51c452-5010-4af5-bb69-941565926337" (UID: "0e51c452-5010-4af5-bb69-941565926337"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.972883 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6500351a-78de-4cb9-bc74-12a450bbc76e" (UID: "6500351a-78de-4cb9-bc74-12a450bbc76e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.976904 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e69dab2-4c11-4352-95c8-92499a4c5a75" (UID: "1e69dab2-4c11-4352-95c8-92499a4c5a75"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986768 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnctx\" (UniqueName: \"kubernetes.io/projected/7886c492-0b69-4cb1-aef7-08e7e482bc6a-kube-api-access-hnctx\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986800 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986813 4761 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986825 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42248\" (UniqueName: \"kubernetes.io/projected/1e69dab2-4c11-4352-95c8-92499a4c5a75-kube-api-access-42248\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986841 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqdsj\" (UniqueName: \"kubernetes.io/projected/0e51c452-5010-4af5-bb69-941565926337-kube-api-access-vqdsj\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986855 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e51c452-5010-4af5-bb69-941565926337-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986866 4761 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7886c492-0b69-4cb1-aef7-08e7e482bc6a-marketplace-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986878 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6500351a-78de-4cb9-bc74-12a450bbc76e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986886 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:42 crc kubenswrapper[4761]: I1201 10:36:42.986894 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e69dab2-4c11-4352-95c8-92499a4c5a75-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.088222 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-utilities\") pod \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.088442 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktwjw\" (UniqueName: \"kubernetes.io/projected/3fe88ace-f487-4b05-a9de-d5bdd2945c75-kube-api-access-ktwjw\") pod \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.088616 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-catalog-content\") pod \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\" (UID: \"3fe88ace-f487-4b05-a9de-d5bdd2945c75\") " Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.089457 4761 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-utilities" (OuterVolumeSpecName: "utilities") pod "3fe88ace-f487-4b05-a9de-d5bdd2945c75" (UID: "3fe88ace-f487-4b05-a9de-d5bdd2945c75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.091829 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe88ace-f487-4b05-a9de-d5bdd2945c75-kube-api-access-ktwjw" (OuterVolumeSpecName: "kube-api-access-ktwjw") pod "3fe88ace-f487-4b05-a9de-d5bdd2945c75" (UID: "3fe88ace-f487-4b05-a9de-d5bdd2945c75"). InnerVolumeSpecName "kube-api-access-ktwjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.146430 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lgbgw"] Dec 01 10:36:43 crc kubenswrapper[4761]: W1201 10:36:43.164671 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04df1b9e_01cf_41e0_af31_dcb2e0512d45.slice/crio-5e2ce56f09ba43b77b977944a61e39718843b4137b7263b8339142887da02696 WatchSource:0}: Error finding container 5e2ce56f09ba43b77b977944a61e39718843b4137b7263b8339142887da02696: Status 404 returned error can't find the container with id 5e2ce56f09ba43b77b977944a61e39718843b4137b7263b8339142887da02696 Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.179132 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fe88ace-f487-4b05-a9de-d5bdd2945c75" (UID: "3fe88ace-f487-4b05-a9de-d5bdd2945c75"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.190000 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.190233 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe88ace-f487-4b05-a9de-d5bdd2945c75-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.190315 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktwjw\" (UniqueName: \"kubernetes.io/projected/3fe88ace-f487-4b05-a9de-d5bdd2945c75-kube-api-access-ktwjw\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.397014 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nztd7" event={"ID":"3fe88ace-f487-4b05-a9de-d5bdd2945c75","Type":"ContainerDied","Data":"33bb00899968ff73a467bd2fc6f2f8bc5ba037aa9871c2382ea83ceb944cb130"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.397369 4761 scope.go:117] "RemoveContainer" containerID="bc6e4a41ee9fa99f3cb30dbae761dbe61705bc0d840abe2adc038c1ed1fb799c" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.397057 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nztd7" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.401785 4761 generic.go:334] "Generic (PLEG): container finished" podID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerID="c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd" exitCode=0 Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.401883 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55clp" event={"ID":"6500351a-78de-4cb9-bc74-12a450bbc76e","Type":"ContainerDied","Data":"c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.401934 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55clp" event={"ID":"6500351a-78de-4cb9-bc74-12a450bbc76e","Type":"ContainerDied","Data":"00c3d5d746c5cff512d6ef28384c5e8380e7ad7e5c1e68f896d756a425a50ec8"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.401860 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-55clp" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.405488 4761 generic.go:334] "Generic (PLEG): container finished" podID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerID="187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312" exitCode=0 Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.405646 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.405874 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" event={"ID":"7886c492-0b69-4cb1-aef7-08e7e482bc6a","Type":"ContainerDied","Data":"187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.405970 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2xx98" event={"ID":"7886c492-0b69-4cb1-aef7-08e7e482bc6a","Type":"ContainerDied","Data":"8086c75beb2d4d1c763b580c8b1e5e41fe96f1fba3767e3e4325a29e489c508b"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.409809 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d99sk" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.409804 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d99sk" event={"ID":"1e69dab2-4c11-4352-95c8-92499a4c5a75","Type":"ContainerDied","Data":"ba5c15a400a6d4aed00da4d3c2b64b13f6aab31e49790a40f6b59b4e5595b686"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.414806 4761 scope.go:117] "RemoveContainer" containerID="0c15fff14c4d45ab4037666824245d1dc5a49cc9c4bdd229320c571aae6fb389" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.415363 4761 generic.go:334] "Generic (PLEG): container finished" podID="0e51c452-5010-4af5-bb69-941565926337" containerID="01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984" exitCode=0 Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.415412 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvkf2" 
event={"ID":"0e51c452-5010-4af5-bb69-941565926337","Type":"ContainerDied","Data":"01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.415462 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvkf2" event={"ID":"0e51c452-5010-4af5-bb69-941565926337","Type":"ContainerDied","Data":"abdd31e9a06bc11898da71133f323843ae38c9eb2dbe6a7242575de4417f25eb"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.415431 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvkf2" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.419010 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" event={"ID":"04df1b9e-01cf-41e0-af31-dcb2e0512d45","Type":"ContainerStarted","Data":"8fe217b35921f1d622a5e412cbe6601f930cceaf400f0bc01f2e7e9963edb47f"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.419070 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" event={"ID":"04df1b9e-01cf-41e0-af31-dcb2e0512d45","Type":"ContainerStarted","Data":"5e2ce56f09ba43b77b977944a61e39718843b4137b7263b8339142887da02696"} Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.419265 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.420495 4761 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lgbgw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.64:8080/healthz\": dial tcp 10.217.0.64:8080: connect: connection refused" start-of-body= Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.420530 4761 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" podUID="04df1b9e-01cf-41e0-af31-dcb2e0512d45" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.64:8080/healthz\": dial tcp 10.217.0.64:8080: connect: connection refused" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.427158 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-55clp"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.435790 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-55clp"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.450541 4761 scope.go:117] "RemoveContainer" containerID="52a8b80f6539ce43d2586d052feb8e21ce9e877b977d2af82cf6c3fcc96780f5" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.451482 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2xx98"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.456835 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2xx98"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.462096 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvkf2"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.467076 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvkf2"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.469345 4761 scope.go:117] "RemoveContainer" containerID="c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.475561 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d99sk"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.479640 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-d99sk"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.484928 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nztd7"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.492568 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nztd7"] Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.493405 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" podStartSLOduration=1.493380856 podStartE2EDuration="1.493380856s" podCreationTimestamp="2025-12-01 10:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:36:43.485715808 +0000 UTC m=+342.789474422" watchObservedRunningTime="2025-12-01 10:36:43.493380856 +0000 UTC m=+342.797139490" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.497870 4761 scope.go:117] "RemoveContainer" containerID="c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.514356 4761 scope.go:117] "RemoveContainer" containerID="cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.529139 4761 scope.go:117] "RemoveContainer" containerID="c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd" Dec 01 10:36:43 crc kubenswrapper[4761]: E1201 10:36:43.529620 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd\": container with ID starting with c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd not found: ID does not exist" containerID="c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd" Dec 01 
10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.529658 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd"} err="failed to get container status \"c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd\": rpc error: code = NotFound desc = could not find container \"c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd\": container with ID starting with c4f64399b07273403273b30dc9baee368fd71c74c414319ec67f166e919356dd not found: ID does not exist" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.529682 4761 scope.go:117] "RemoveContainer" containerID="c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc" Dec 01 10:36:43 crc kubenswrapper[4761]: E1201 10:36:43.530322 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc\": container with ID starting with c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc not found: ID does not exist" containerID="c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.530360 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc"} err="failed to get container status \"c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc\": rpc error: code = NotFound desc = could not find container \"c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc\": container with ID starting with c181d36cc3f5ef6cb83beb6bc213c56ab5c3ed9ef2d509a6dbfc3888bd9214cc not found: ID does not exist" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.530405 4761 scope.go:117] "RemoveContainer" 
containerID="cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce" Dec 01 10:36:43 crc kubenswrapper[4761]: E1201 10:36:43.530975 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce\": container with ID starting with cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce not found: ID does not exist" containerID="cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.531012 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce"} err="failed to get container status \"cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce\": rpc error: code = NotFound desc = could not find container \"cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce\": container with ID starting with cf3f49d5e4968db414fd8d782e90fa54aedfd7bf5fbd450ae811114e921876ce not found: ID does not exist" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.531025 4761 scope.go:117] "RemoveContainer" containerID="187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.544221 4761 scope.go:117] "RemoveContainer" containerID="0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.557349 4761 scope.go:117] "RemoveContainer" containerID="187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312" Dec 01 10:36:43 crc kubenswrapper[4761]: E1201 10:36:43.557686 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312\": container with ID starting with 
187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312 not found: ID does not exist" containerID="187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.557733 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312"} err="failed to get container status \"187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312\": rpc error: code = NotFound desc = could not find container \"187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312\": container with ID starting with 187f581da6a5e4d7a6348be40869f3444e8fffb7bc654888264b5fd294ae8312 not found: ID does not exist" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.557766 4761 scope.go:117] "RemoveContainer" containerID="0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e" Dec 01 10:36:43 crc kubenswrapper[4761]: E1201 10:36:43.558106 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e\": container with ID starting with 0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e not found: ID does not exist" containerID="0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.558130 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e"} err="failed to get container status \"0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e\": rpc error: code = NotFound desc = could not find container \"0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e\": container with ID starting with 0269ef10d4add0bd3821958b59d6129152a0655244f29efeb23dec8f18a1294e not found: ID does not 
exist" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.558146 4761 scope.go:117] "RemoveContainer" containerID="c37465957bcbbab7f933eeb3b4f186c26f6625eb3b4e196ebf0a05d05579ca70" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.572680 4761 scope.go:117] "RemoveContainer" containerID="409b1b1f9063bd31e3060195fe09e467c62898ce54b54b239a459b30bde9e8fd" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.587429 4761 scope.go:117] "RemoveContainer" containerID="5fd1fad99bb3a8f8e25ec56d46863c20df66ec101b60c4f8ec0d16dc8b55bbe8" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.600609 4761 scope.go:117] "RemoveContainer" containerID="01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.612165 4761 scope.go:117] "RemoveContainer" containerID="8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.626383 4761 scope.go:117] "RemoveContainer" containerID="4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.645665 4761 scope.go:117] "RemoveContainer" containerID="01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984" Dec 01 10:36:43 crc kubenswrapper[4761]: E1201 10:36:43.646082 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984\": container with ID starting with 01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984 not found: ID does not exist" containerID="01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.646121 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984"} err="failed to get container status 
\"01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984\": rpc error: code = NotFound desc = could not find container \"01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984\": container with ID starting with 01c4764cb7148a4e69a898139e44b534742459aba2d6f12b526751b218016984 not found: ID does not exist" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.646155 4761 scope.go:117] "RemoveContainer" containerID="8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6" Dec 01 10:36:43 crc kubenswrapper[4761]: E1201 10:36:43.646399 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6\": container with ID starting with 8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6 not found: ID does not exist" containerID="8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.646427 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6"} err="failed to get container status \"8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6\": rpc error: code = NotFound desc = could not find container \"8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6\": container with ID starting with 8a1ce8c5d9f95f3bd7e497b642bfa09e47d25fbdfd5e0b2d38857166260217e6 not found: ID does not exist" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.646443 4761 scope.go:117] "RemoveContainer" containerID="4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f" Dec 01 10:36:43 crc kubenswrapper[4761]: E1201 10:36:43.646707 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f\": container with ID starting with 4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f not found: ID does not exist" containerID="4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.646736 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f"} err="failed to get container status \"4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f\": rpc error: code = NotFound desc = could not find container \"4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f\": container with ID starting with 4d3b850a3e508109af077aa0cd57b5a762e4ffeb3d2ff4cc1f55714f0c40d83f not found: ID does not exist" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.678954 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7m7cn" Dec 01 10:36:43 crc kubenswrapper[4761]: I1201 10:36:43.747160 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5s745"] Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.436088 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lgbgw" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665219 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8jdwr"] Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665738 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665774 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" 
containerName="marketplace-operator" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665789 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665795 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665808 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665814 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665831 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerName="extract-utilities" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665852 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerName="extract-utilities" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665863 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerName="extract-utilities" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665869 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerName="extract-utilities" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665879 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerName="extract-content" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665887 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" 
containerName="extract-content" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665900 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e51c452-5010-4af5-bb69-941565926337" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665906 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e51c452-5010-4af5-bb69-941565926337" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665913 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e51c452-5010-4af5-bb69-941565926337" containerName="extract-utilities" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665935 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e51c452-5010-4af5-bb69-941565926337" containerName="extract-utilities" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665951 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665957 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665969 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.665975 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.665988 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e51c452-5010-4af5-bb69-941565926337" containerName="extract-content" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666015 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e51c452-5010-4af5-bb69-941565926337" 
containerName="extract-content" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.666024 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerName="extract-utilities" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666030 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerName="extract-utilities" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.666043 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerName="extract-content" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666049 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerName="extract-content" Dec 01 10:36:44 crc kubenswrapper[4761]: E1201 10:36:44.666059 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerName="extract-content" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666065 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerName="extract-content" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666274 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666284 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666296 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e51c452-5010-4af5-bb69-941565926337" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666324 4761 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.666333 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" containerName="registry-server" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.667161 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" containerName="marketplace-operator" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.668072 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.672137 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.677138 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jdwr"] Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.811158 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6zsz\" (UniqueName: \"kubernetes.io/projected/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-kube-api-access-t6zsz\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.811222 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-utilities\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.811254 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-catalog-content\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.858045 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b5vk2"] Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.859223 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.861137 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.868696 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5vk2"] Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.912749 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6zsz\" (UniqueName: \"kubernetes.io/projected/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-kube-api-access-t6zsz\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.912813 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-utilities\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.912839 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-catalog-content\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.913294 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-utilities\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.913377 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-catalog-content\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.931537 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6zsz\" (UniqueName: \"kubernetes.io/projected/687031d5-0ddd-4dee-a39a-9b0a3a32bf69-kube-api-access-t6zsz\") pod \"redhat-marketplace-8jdwr\" (UID: \"687031d5-0ddd-4dee-a39a-9b0a3a32bf69\") " pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:44 crc kubenswrapper[4761]: I1201 10:36:44.984593 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.013651 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnz56\" (UniqueName: \"kubernetes.io/projected/4b483973-7f6c-4581-b676-d19f25446c7a-kube-api-access-lnz56\") pod \"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.013879 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b483973-7f6c-4581-b676-d19f25446c7a-utilities\") pod \"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.014003 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b483973-7f6c-4581-b676-d19f25446c7a-catalog-content\") pod \"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.118230 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b483973-7f6c-4581-b676-d19f25446c7a-utilities\") pod \"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.118664 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b483973-7f6c-4581-b676-d19f25446c7a-catalog-content\") pod 
\"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.118722 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnz56\" (UniqueName: \"kubernetes.io/projected/4b483973-7f6c-4581-b676-d19f25446c7a-kube-api-access-lnz56\") pod \"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.118846 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b483973-7f6c-4581-b676-d19f25446c7a-utilities\") pod \"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.119112 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b483973-7f6c-4581-b676-d19f25446c7a-catalog-content\") pod \"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.136769 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e51c452-5010-4af5-bb69-941565926337" path="/var/lib/kubelet/pods/0e51c452-5010-4af5-bb69-941565926337/volumes" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.137780 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e69dab2-4c11-4352-95c8-92499a4c5a75" path="/var/lib/kubelet/pods/1e69dab2-4c11-4352-95c8-92499a4c5a75/volumes" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.139055 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fe88ace-f487-4b05-a9de-d5bdd2945c75" 
path="/var/lib/kubelet/pods/3fe88ace-f487-4b05-a9de-d5bdd2945c75/volumes" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.140256 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6500351a-78de-4cb9-bc74-12a450bbc76e" path="/var/lib/kubelet/pods/6500351a-78de-4cb9-bc74-12a450bbc76e/volumes" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.141391 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7886c492-0b69-4cb1-aef7-08e7e482bc6a" path="/var/lib/kubelet/pods/7886c492-0b69-4cb1-aef7-08e7e482bc6a/volumes" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.161612 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnz56\" (UniqueName: \"kubernetes.io/projected/4b483973-7f6c-4581-b676-d19f25446c7a-kube-api-access-lnz56\") pod \"certified-operators-b5vk2\" (UID: \"4b483973-7f6c-4581-b676-d19f25446c7a\") " pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.173190 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.370673 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jdwr"] Dec 01 10:36:45 crc kubenswrapper[4761]: W1201 10:36:45.378620 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687031d5_0ddd_4dee_a39a_9b0a3a32bf69.slice/crio-701918fb1165e4aff5a1545f8dbc2caa3648f83fd693d3566b9755cfff7b9726 WatchSource:0}: Error finding container 701918fb1165e4aff5a1545f8dbc2caa3648f83fd693d3566b9755cfff7b9726: Status 404 returned error can't find the container with id 701918fb1165e4aff5a1545f8dbc2caa3648f83fd693d3566b9755cfff7b9726 Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.438171 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jdwr" event={"ID":"687031d5-0ddd-4dee-a39a-9b0a3a32bf69","Type":"ContainerStarted","Data":"701918fb1165e4aff5a1545f8dbc2caa3648f83fd693d3566b9755cfff7b9726"} Dec 01 10:36:45 crc kubenswrapper[4761]: I1201 10:36:45.565377 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5vk2"] Dec 01 10:36:45 crc kubenswrapper[4761]: W1201 10:36:45.570216 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b483973_7f6c_4581_b676_d19f25446c7a.slice/crio-f59e89b9afb6f3d735ef692a10171cd00a0a589affcb2a77f96b11c75e66e3e0 WatchSource:0}: Error finding container f59e89b9afb6f3d735ef692a10171cd00a0a589affcb2a77f96b11c75e66e3e0: Status 404 returned error can't find the container with id f59e89b9afb6f3d735ef692a10171cd00a0a589affcb2a77f96b11c75e66e3e0 Dec 01 10:36:46 crc kubenswrapper[4761]: I1201 10:36:46.444835 4761 generic.go:334] "Generic (PLEG): container finished" podID="687031d5-0ddd-4dee-a39a-9b0a3a32bf69" 
containerID="03514210d3dad16ee58793f13d4a6de94ee8b88618f246accab0778a4e2fef2e" exitCode=0 Dec 01 10:36:46 crc kubenswrapper[4761]: I1201 10:36:46.444918 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jdwr" event={"ID":"687031d5-0ddd-4dee-a39a-9b0a3a32bf69","Type":"ContainerDied","Data":"03514210d3dad16ee58793f13d4a6de94ee8b88618f246accab0778a4e2fef2e"} Dec 01 10:36:46 crc kubenswrapper[4761]: I1201 10:36:46.447925 4761 generic.go:334] "Generic (PLEG): container finished" podID="4b483973-7f6c-4581-b676-d19f25446c7a" containerID="c63152c0863e9a54e02219e9e2d6ff5e8a540560f08018ffde68ce39f32cbd62" exitCode=0 Dec 01 10:36:46 crc kubenswrapper[4761]: I1201 10:36:46.447970 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5vk2" event={"ID":"4b483973-7f6c-4581-b676-d19f25446c7a","Type":"ContainerDied","Data":"c63152c0863e9a54e02219e9e2d6ff5e8a540560f08018ffde68ce39f32cbd62"} Dec 01 10:36:46 crc kubenswrapper[4761]: I1201 10:36:46.448015 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5vk2" event={"ID":"4b483973-7f6c-4581-b676-d19f25446c7a","Type":"ContainerStarted","Data":"f59e89b9afb6f3d735ef692a10171cd00a0a589affcb2a77f96b11c75e66e3e0"} Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.064278 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p6sdd"] Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.065369 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.068898 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.077725 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6sdd"] Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.246644 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-catalog-content\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.246684 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9krgl\" (UniqueName: \"kubernetes.io/projected/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-kube-api-access-9krgl\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.246721 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-utilities\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.257063 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5vnxn"] Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.257990 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.259669 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.265342 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5vnxn"] Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.347798 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-catalog-content\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.347844 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9krgl\" (UniqueName: \"kubernetes.io/projected/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-kube-api-access-9krgl\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.347898 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-utilities\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.348455 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-utilities\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " 
pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.348573 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-catalog-content\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.373910 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9krgl\" (UniqueName: \"kubernetes.io/projected/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-kube-api-access-9krgl\") pod \"community-operators-p6sdd\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") " pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.380507 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.448676 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-utilities\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.448991 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7spmj\" (UniqueName: \"kubernetes.io/projected/1396c61b-86e8-41e3-90d3-76c88b8c7994-kube-api-access-7spmj\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.449059 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-catalog-content\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.549998 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-utilities\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.550071 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7spmj\" (UniqueName: \"kubernetes.io/projected/1396c61b-86e8-41e3-90d3-76c88b8c7994-kube-api-access-7spmj\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.550101 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-catalog-content\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.550656 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-utilities\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.550670 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-catalog-content\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.574935 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7spmj\" (UniqueName: \"kubernetes.io/projected/1396c61b-86e8-41e3-90d3-76c88b8c7994-kube-api-access-7spmj\") pod \"redhat-operators-5vnxn\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.772466 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6sdd"] Dec 01 10:36:47 crc kubenswrapper[4761]: I1201 10:36:47.871612 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:48 crc kubenswrapper[4761]: I1201 10:36:48.465651 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6sdd" event={"ID":"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b","Type":"ContainerStarted","Data":"10ee87e594d46230670ea9b2af7c2c6d8c8615b8c05ba68af1c664d47108abc6"} Dec 01 10:36:48 crc kubenswrapper[4761]: I1201 10:36:48.708693 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5vnxn"] Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.474618 4761 generic.go:334] "Generic (PLEG): container finished" podID="d5c8ad76-1c9b-4463-84cd-9b4501f80f8b" containerID="a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717" exitCode=0 Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.474712 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6sdd" 
event={"ID":"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b","Type":"ContainerDied","Data":"a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717"} Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.482874 4761 generic.go:334] "Generic (PLEG): container finished" podID="687031d5-0ddd-4dee-a39a-9b0a3a32bf69" containerID="85079563fe5ad7298527af2926d002fcde6412ebb4b68b6fcb4fa83a95040472" exitCode=0 Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.482999 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jdwr" event={"ID":"687031d5-0ddd-4dee-a39a-9b0a3a32bf69","Type":"ContainerDied","Data":"85079563fe5ad7298527af2926d002fcde6412ebb4b68b6fcb4fa83a95040472"} Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.485577 4761 generic.go:334] "Generic (PLEG): container finished" podID="4b483973-7f6c-4581-b676-d19f25446c7a" containerID="9c655c6608d9018bc625c18bdceffd3f329d977b6ed722b30cd6f2fbd5f8cc09" exitCode=0 Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.485690 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5vk2" event={"ID":"4b483973-7f6c-4581-b676-d19f25446c7a","Type":"ContainerDied","Data":"9c655c6608d9018bc625c18bdceffd3f329d977b6ed722b30cd6f2fbd5f8cc09"} Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.488994 4761 generic.go:334] "Generic (PLEG): container finished" podID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerID="076f9017946141c281694a50fabe1c861187e3fb7ef656149f8c412d1f88dedb" exitCode=0 Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.489302 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vnxn" event={"ID":"1396c61b-86e8-41e3-90d3-76c88b8c7994","Type":"ContainerDied","Data":"076f9017946141c281694a50fabe1c861187e3fb7ef656149f8c412d1f88dedb"} Dec 01 10:36:49 crc kubenswrapper[4761]: I1201 10:36:49.489362 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-5vnxn" event={"ID":"1396c61b-86e8-41e3-90d3-76c88b8c7994","Type":"ContainerStarted","Data":"71209cbc57e363e89f17f72ddf7531c295813ee2eb643fdc0e3b34423a9ab277"} Dec 01 10:36:51 crc kubenswrapper[4761]: I1201 10:36:51.506956 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5vk2" event={"ID":"4b483973-7f6c-4581-b676-d19f25446c7a","Type":"ContainerStarted","Data":"ab4780d2fec420620cb377ae92995c344a2a7da3aa009ef55d088b3125d22bc3"} Dec 01 10:36:51 crc kubenswrapper[4761]: I1201 10:36:51.514865 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vnxn" event={"ID":"1396c61b-86e8-41e3-90d3-76c88b8c7994","Type":"ContainerStarted","Data":"ea75f85d815a0374a9115194268c0ee30a88a20fafcb63d33532ab2641d976e8"} Dec 01 10:36:51 crc kubenswrapper[4761]: I1201 10:36:51.518405 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6sdd" event={"ID":"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b","Type":"ContainerStarted","Data":"d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b"} Dec 01 10:36:51 crc kubenswrapper[4761]: I1201 10:36:51.521785 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jdwr" event={"ID":"687031d5-0ddd-4dee-a39a-9b0a3a32bf69","Type":"ContainerStarted","Data":"99939879651b63367a7d2bcd05e3c42f7f9c32952fb6b8c9b2bd59397fd8476f"} Dec 01 10:36:51 crc kubenswrapper[4761]: I1201 10:36:51.537637 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b5vk2" podStartSLOduration=3.655165194 podStartE2EDuration="7.537616916s" podCreationTimestamp="2025-12-01 10:36:44 +0000 UTC" firstStartedPulling="2025-12-01 10:36:46.449716049 +0000 UTC m=+345.753474673" lastFinishedPulling="2025-12-01 10:36:50.332167771 +0000 UTC m=+349.635926395" observedRunningTime="2025-12-01 
10:36:51.522107317 +0000 UTC m=+350.825865951" watchObservedRunningTime="2025-12-01 10:36:51.537616916 +0000 UTC m=+350.841375540" Dec 01 10:36:51 crc kubenswrapper[4761]: I1201 10:36:51.550357 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8jdwr" podStartSLOduration=3.586556689 podStartE2EDuration="7.550327428s" podCreationTimestamp="2025-12-01 10:36:44 +0000 UTC" firstStartedPulling="2025-12-01 10:36:46.446736577 +0000 UTC m=+345.750495201" lastFinishedPulling="2025-12-01 10:36:50.410507296 +0000 UTC m=+349.714265940" observedRunningTime="2025-12-01 10:36:51.546745984 +0000 UTC m=+350.850504618" watchObservedRunningTime="2025-12-01 10:36:51.550327428 +0000 UTC m=+350.854086052" Dec 01 10:36:52 crc kubenswrapper[4761]: I1201 10:36:52.529797 4761 generic.go:334] "Generic (PLEG): container finished" podID="d5c8ad76-1c9b-4463-84cd-9b4501f80f8b" containerID="d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b" exitCode=0 Dec 01 10:36:52 crc kubenswrapper[4761]: I1201 10:36:52.529914 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6sdd" event={"ID":"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b","Type":"ContainerDied","Data":"d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b"} Dec 01 10:36:52 crc kubenswrapper[4761]: I1201 10:36:52.530118 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6sdd" event={"ID":"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b","Type":"ContainerStarted","Data":"d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b"} Dec 01 10:36:52 crc kubenswrapper[4761]: I1201 10:36:52.531873 4761 generic.go:334] "Generic (PLEG): container finished" podID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerID="ea75f85d815a0374a9115194268c0ee30a88a20fafcb63d33532ab2641d976e8" exitCode=0 Dec 01 10:36:52 crc kubenswrapper[4761]: I1201 10:36:52.531989 4761 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vnxn" event={"ID":"1396c61b-86e8-41e3-90d3-76c88b8c7994","Type":"ContainerDied","Data":"ea75f85d815a0374a9115194268c0ee30a88a20fafcb63d33532ab2641d976e8"} Dec 01 10:36:52 crc kubenswrapper[4761]: I1201 10:36:52.550050 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p6sdd" podStartSLOduration=3.018715551 podStartE2EDuration="5.550025704s" podCreationTimestamp="2025-12-01 10:36:47 +0000 UTC" firstStartedPulling="2025-12-01 10:36:49.496214071 +0000 UTC m=+348.799972695" lastFinishedPulling="2025-12-01 10:36:52.027524224 +0000 UTC m=+351.331282848" observedRunningTime="2025-12-01 10:36:52.545439689 +0000 UTC m=+351.849198313" watchObservedRunningTime="2025-12-01 10:36:52.550025704 +0000 UTC m=+351.853784348" Dec 01 10:36:53 crc kubenswrapper[4761]: I1201 10:36:53.544691 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vnxn" event={"ID":"1396c61b-86e8-41e3-90d3-76c88b8c7994","Type":"ContainerStarted","Data":"d318677f31f66e22c59e62bda306a046e2af3e5d3fb1833607b0887536ae339e"} Dec 01 10:36:54 crc kubenswrapper[4761]: I1201 10:36:54.985470 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:54 crc kubenswrapper[4761]: I1201 10:36:54.986726 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:55 crc kubenswrapper[4761]: I1201 10:36:55.049878 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:36:55 crc kubenswrapper[4761]: I1201 10:36:55.073089 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5vnxn" podStartSLOduration=4.353813258 
podStartE2EDuration="8.073069257s" podCreationTimestamp="2025-12-01 10:36:47 +0000 UTC" firstStartedPulling="2025-12-01 10:36:49.508835271 +0000 UTC m=+348.812593895" lastFinishedPulling="2025-12-01 10:36:53.22809126 +0000 UTC m=+352.531849894" observedRunningTime="2025-12-01 10:36:53.564908192 +0000 UTC m=+352.868666816" watchObservedRunningTime="2025-12-01 10:36:55.073069257 +0000 UTC m=+354.376827881" Dec 01 10:36:55 crc kubenswrapper[4761]: I1201 10:36:55.173652 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:55 crc kubenswrapper[4761]: I1201 10:36:55.173716 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:55 crc kubenswrapper[4761]: I1201 10:36:55.229308 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:36:57 crc kubenswrapper[4761]: I1201 10:36:57.380919 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:57 crc kubenswrapper[4761]: I1201 10:36:57.381004 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:57 crc kubenswrapper[4761]: I1201 10:36:57.428137 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:57 crc kubenswrapper[4761]: I1201 10:36:57.605738 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p6sdd" Dec 01 10:36:57 crc kubenswrapper[4761]: I1201 10:36:57.898310 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:57 crc kubenswrapper[4761]: I1201 10:36:57.898355 4761 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:36:58 crc kubenswrapper[4761]: I1201 10:36:58.935874 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5vnxn" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="registry-server" probeResult="failure" output=< Dec 01 10:36:58 crc kubenswrapper[4761]: timeout: failed to connect service ":50051" within 1s Dec 01 10:36:58 crc kubenswrapper[4761]: > Dec 01 10:37:03 crc kubenswrapper[4761]: I1201 10:37:03.850682 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:37:03 crc kubenswrapper[4761]: I1201 10:37:03.851027 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:37:05 crc kubenswrapper[4761]: I1201 10:37:05.030984 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8jdwr" Dec 01 10:37:05 crc kubenswrapper[4761]: I1201 10:37:05.229961 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b5vk2" Dec 01 10:37:07 crc kubenswrapper[4761]: I1201 10:37:07.913408 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:37:07 crc kubenswrapper[4761]: I1201 10:37:07.957799 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:37:08 crc kubenswrapper[4761]: I1201 10:37:08.783386 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" podUID="c5615f9d-052a-4910-8050-d39d2d9dde06" containerName="registry" containerID="cri-o://764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924" gracePeriod=30 Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.199277 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.354715 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5615f9d-052a-4910-8050-d39d2d9dde06-ca-trust-extracted\") pod \"c5615f9d-052a-4910-8050-d39d2d9dde06\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.354773 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-trusted-ca\") pod \"c5615f9d-052a-4910-8050-d39d2d9dde06\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.354812 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-tls\") pod \"c5615f9d-052a-4910-8050-d39d2d9dde06\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.355011 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"c5615f9d-052a-4910-8050-d39d2d9dde06\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.355033 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-bound-sa-token\") pod \"c5615f9d-052a-4910-8050-d39d2d9dde06\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.355624 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c5615f9d-052a-4910-8050-d39d2d9dde06" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.355096 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmjt5\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-kube-api-access-kmjt5\") pod \"c5615f9d-052a-4910-8050-d39d2d9dde06\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.355749 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5615f9d-052a-4910-8050-d39d2d9dde06-installation-pull-secrets\") pod \"c5615f9d-052a-4910-8050-d39d2d9dde06\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.355767 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-certificates\") pod \"c5615f9d-052a-4910-8050-d39d2d9dde06\" (UID: \"c5615f9d-052a-4910-8050-d39d2d9dde06\") " Dec 01 10:37:09 
crc kubenswrapper[4761]: I1201 10:37:09.355977 4761 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.356346 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c5615f9d-052a-4910-8050-d39d2d9dde06" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.363990 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c5615f9d-052a-4910-8050-d39d2d9dde06" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.364427 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-kube-api-access-kmjt5" (OuterVolumeSpecName: "kube-api-access-kmjt5") pod "c5615f9d-052a-4910-8050-d39d2d9dde06" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06"). InnerVolumeSpecName "kube-api-access-kmjt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.364813 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c5615f9d-052a-4910-8050-d39d2d9dde06" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.375772 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5615f9d-052a-4910-8050-d39d2d9dde06-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c5615f9d-052a-4910-8050-d39d2d9dde06" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.379214 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5615f9d-052a-4910-8050-d39d2d9dde06-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c5615f9d-052a-4910-8050-d39d2d9dde06" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.380582 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c5615f9d-052a-4910-8050-d39d2d9dde06" (UID: "c5615f9d-052a-4910-8050-d39d2d9dde06"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.457533 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmjt5\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-kube-api-access-kmjt5\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.457592 4761 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5615f9d-052a-4910-8050-d39d2d9dde06-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.457604 4761 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.457616 4761 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5615f9d-052a-4910-8050-d39d2d9dde06-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.457630 4761 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.457643 4761 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5615f9d-052a-4910-8050-d39d2d9dde06-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.633815 4761 generic.go:334] "Generic (PLEG): container finished" podID="c5615f9d-052a-4910-8050-d39d2d9dde06" containerID="764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924" exitCode=0 Dec 01 10:37:09 crc 
kubenswrapper[4761]: I1201 10:37:09.633859 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" event={"ID":"c5615f9d-052a-4910-8050-d39d2d9dde06","Type":"ContainerDied","Data":"764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924"} Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.633888 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" event={"ID":"c5615f9d-052a-4910-8050-d39d2d9dde06","Type":"ContainerDied","Data":"a427a7efb09d8c84ec4889f6be235894b93b36fe41394281857409e960e9ece5"} Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.633909 4761 scope.go:117] "RemoveContainer" containerID="764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.634025 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5s745" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.661641 4761 scope.go:117] "RemoveContainer" containerID="764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924" Dec 01 10:37:09 crc kubenswrapper[4761]: E1201 10:37:09.662039 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924\": container with ID starting with 764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924 not found: ID does not exist" containerID="764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.662070 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924"} err="failed to get container status 
\"764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924\": rpc error: code = NotFound desc = could not find container \"764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924\": container with ID starting with 764c7f547151e34c58701c30592e176fbaeb189d7b0f76bb221b0007248d3924 not found: ID does not exist" Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.669107 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5s745"] Dec 01 10:37:09 crc kubenswrapper[4761]: I1201 10:37:09.672414 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5s745"] Dec 01 10:37:11 crc kubenswrapper[4761]: I1201 10:37:11.137772 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5615f9d-052a-4910-8050-d39d2d9dde06" path="/var/lib/kubelet/pods/c5615f9d-052a-4910-8050-d39d2d9dde06/volumes" Dec 01 10:37:33 crc kubenswrapper[4761]: I1201 10:37:33.850000 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:37:33 crc kubenswrapper[4761]: I1201 10:37:33.850673 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:38:03 crc kubenswrapper[4761]: I1201 10:38:03.849884 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 01 10:38:03 crc kubenswrapper[4761]: I1201 10:38:03.850390 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:38:03 crc kubenswrapper[4761]: I1201 10:38:03.850448 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:38:03 crc kubenswrapper[4761]: I1201 10:38:03.851059 4761 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2d1f25511ab54e969e2db56032fd59e29dcd744fd868077745072a36be032ba"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:38:03 crc kubenswrapper[4761]: I1201 10:38:03.851124 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://c2d1f25511ab54e969e2db56032fd59e29dcd744fd868077745072a36be032ba" gracePeriod=600 Dec 01 10:38:04 crc kubenswrapper[4761]: I1201 10:38:04.943475 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="c2d1f25511ab54e969e2db56032fd59e29dcd744fd868077745072a36be032ba" exitCode=0 Dec 01 10:38:04 crc kubenswrapper[4761]: I1201 10:38:04.943611 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" 
event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"c2d1f25511ab54e969e2db56032fd59e29dcd744fd868077745072a36be032ba"} Dec 01 10:38:04 crc kubenswrapper[4761]: I1201 10:38:04.943928 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"158963cf7c7332677495f8902b02e1b832dfd26ac99929eec34f87750405cba2"} Dec 01 10:38:04 crc kubenswrapper[4761]: I1201 10:38:04.943951 4761 scope.go:117] "RemoveContainer" containerID="eaefda698fb6d6a59562f9e31cdbfb638985f057569d01a0b0d9d620bdae39e4" Dec 01 10:40:33 crc kubenswrapper[4761]: I1201 10:40:33.850012 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:40:33 crc kubenswrapper[4761]: I1201 10:40:33.851615 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:41:03 crc kubenswrapper[4761]: I1201 10:41:03.850312 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:41:03 crc kubenswrapper[4761]: I1201 10:41:03.850873 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:41:33 crc kubenswrapper[4761]: I1201 10:41:33.850746 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:41:33 crc kubenswrapper[4761]: I1201 10:41:33.851677 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:41:33 crc kubenswrapper[4761]: I1201 10:41:33.851773 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:41:33 crc kubenswrapper[4761]: I1201 10:41:33.852905 4761 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"158963cf7c7332677495f8902b02e1b832dfd26ac99929eec34f87750405cba2"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:41:33 crc kubenswrapper[4761]: I1201 10:41:33.853036 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://158963cf7c7332677495f8902b02e1b832dfd26ac99929eec34f87750405cba2" gracePeriod=600 Dec 01 10:41:34 crc kubenswrapper[4761]: I1201 
10:41:34.248537 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="158963cf7c7332677495f8902b02e1b832dfd26ac99929eec34f87750405cba2" exitCode=0 Dec 01 10:41:34 crc kubenswrapper[4761]: I1201 10:41:34.248617 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"158963cf7c7332677495f8902b02e1b832dfd26ac99929eec34f87750405cba2"} Dec 01 10:41:34 crc kubenswrapper[4761]: I1201 10:41:34.248652 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"3eb417125a9051f5c4c312a6fe5fbfd28525e926ddf81a026e3b1bb704152208"} Dec 01 10:41:34 crc kubenswrapper[4761]: I1201 10:41:34.248677 4761 scope.go:117] "RemoveContainer" containerID="c2d1f25511ab54e969e2db56032fd59e29dcd744fd868077745072a36be032ba" Dec 01 10:42:10 crc kubenswrapper[4761]: I1201 10:42:10.675327 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pllhm"] Dec 01 10:42:10 crc kubenswrapper[4761]: I1201 10:42:10.676524 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovn-controller" containerID="cri-o://e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc" gracePeriod=30 Dec 01 10:42:10 crc kubenswrapper[4761]: I1201 10:42:10.676633 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="nbdb" containerID="cri-o://7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09" gracePeriod=30 Dec 01 10:42:10 crc kubenswrapper[4761]: 
I1201 10:42:10.676675 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9" gracePeriod=30 Dec 01 10:42:10 crc kubenswrapper[4761]: I1201 10:42:10.676755 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kube-rbac-proxy-node" containerID="cri-o://cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6" gracePeriod=30 Dec 01 10:42:10 crc kubenswrapper[4761]: I1201 10:42:10.676702 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovn-acl-logging" containerID="cri-o://2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5" gracePeriod=30 Dec 01 10:42:10 crc kubenswrapper[4761]: I1201 10:42:10.676889 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="northd" containerID="cri-o://57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c" gracePeriod=30 Dec 01 10:42:10 crc kubenswrapper[4761]: I1201 10:42:10.676982 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="sbdb" containerID="cri-o://7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600" gracePeriod=30 Dec 01 10:42:10 crc kubenswrapper[4761]: I1201 10:42:10.756411 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" 
podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" containerID="cri-o://66d185ea008facfbf66c4693ed2abbb7d581c51a627a47074fc8cc3a1292b153" gracePeriod=30 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.490639 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/2.log" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.491899 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/1.log" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.491955 4761 generic.go:334] "Generic (PLEG): container finished" podID="7a9149d7-77b0-4df1-8d1a-5a94ef00463a" containerID="5d5ba0b4c00a761700fbb26c07c77c1fefe4b5b54df3f83e70592beb830196eb" exitCode=2 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.492032 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nz6qt" event={"ID":"7a9149d7-77b0-4df1-8d1a-5a94ef00463a","Type":"ContainerDied","Data":"5d5ba0b4c00a761700fbb26c07c77c1fefe4b5b54df3f83e70592beb830196eb"} Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.492085 4761 scope.go:117] "RemoveContainer" containerID="4e948041f57df5a0935e30229e3d340f05630f051c7e6c5cc1976b58d8788128" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.493023 4761 scope.go:117] "RemoveContainer" containerID="5d5ba0b4c00a761700fbb26c07c77c1fefe4b5b54df3f83e70592beb830196eb" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.493383 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nz6qt_openshift-multus(7a9149d7-77b0-4df1-8d1a-5a94ef00463a)\"" pod="openshift-multus/multus-nz6qt" podUID="7a9149d7-77b0-4df1-8d1a-5a94ef00463a" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.497734 
4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovnkube-controller/3.log" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.501217 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovn-acl-logging/0.log" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.502353 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovn-controller/0.log" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503110 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="66d185ea008facfbf66c4693ed2abbb7d581c51a627a47074fc8cc3a1292b153" exitCode=0 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503156 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600" exitCode=0 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503177 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09" exitCode=0 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503201 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c" exitCode=0 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503219 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9" exitCode=0 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503239 4761 generic.go:334] 
"Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6" exitCode=0 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503260 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5" exitCode=143 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503277 4761 generic.go:334] "Generic (PLEG): container finished" podID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerID="e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc" exitCode=143 Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503319 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"66d185ea008facfbf66c4693ed2abbb7d581c51a627a47074fc8cc3a1292b153"} Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503372 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600"} Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503402 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09"} Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503426 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c"} Dec 01 10:42:11 crc kubenswrapper[4761]: 
I1201 10:42:11.503451 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9"} Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503475 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6"} Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503499 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5"} Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.503523 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc"} Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.518972 4761 scope.go:117] "RemoveContainer" containerID="f401083c228e35bfa2a09875efc811ca222426f1b925e202d2453703d216aa30" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.543976 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovn-acl-logging/0.log" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.544837 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovn-controller/0.log" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.545295 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610005 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fx85t"] Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610277 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610304 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610323 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610337 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610353 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovn-acl-logging" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610366 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovn-acl-logging" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610406 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="northd" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610418 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="northd" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610437 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" 
containerName="kube-rbac-proxy-node" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610449 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kube-rbac-proxy-node" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610464 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610475 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610490 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovn-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610502 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovn-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610517 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610532 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610570 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kubecfg-setup" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610583 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kubecfg-setup" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610605 4761 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c5615f9d-052a-4910-8050-d39d2d9dde06" containerName="registry" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610617 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5615f9d-052a-4910-8050-d39d2d9dde06" containerName="registry" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610630 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="nbdb" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610642 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="nbdb" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610655 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610668 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.610684 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="sbdb" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610696 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="sbdb" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610845 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610861 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kube-rbac-proxy-node" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610877 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" 
containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610892 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovn-acl-logging" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610906 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="nbdb" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610926 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610938 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5615f9d-052a-4910-8050-d39d2d9dde06" containerName="registry" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610952 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="sbdb" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610967 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610979 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.610995 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="northd" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.611019 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovn-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: E1201 10:42:11.611182 4761 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.611196 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.611357 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" containerName="ovnkube-controller" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.613708 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632683 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-openvswitch\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632718 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-netns\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632746 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-script-lib\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632774 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-ovn\") pod 
\"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632794 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-config\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632819 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96n87\" (UniqueName: \"kubernetes.io/projected/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-kube-api-access-96n87\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632836 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-bin\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632831 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632874 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-etc-openvswitch\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632849 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632898 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632944 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-ovn-kubernetes\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633008 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-log-socket\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632913 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633034 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-env-overrides\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633069 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-var-lib-openvswitch\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633093 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovn-node-metrics-cert\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633109 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-netd\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632929 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633146 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-slash\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.632973 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633055 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-log-socket" (OuterVolumeSpecName: "log-socket") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633131 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633175 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633160 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-kubelet\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633201 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633219 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-systemd\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633235 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-systemd-units\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 
10:42:11.633272 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-node-log\") pod \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\" (UID: \"463dbf7c-b2d9-4f91-819c-f74a30d5d01b\") " Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633309 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633340 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-slash" (OuterVolumeSpecName: "host-slash") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633428 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-etc-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633475 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-log-socket\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633500 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-slash\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633519 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-ovn\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633536 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-run-netns\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633567 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-systemd\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633583 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovnkube-config\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633583 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633596 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633615 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovnkube-script-lib\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633639 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-systemd-units\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633653 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggrb9\" (UniqueName: \"kubernetes.io/projected/4a8b6434-3718-44f2-afb5-230ffd2b7857-kube-api-access-ggrb9\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633673 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-env-overrides\") pod \"ovnkube-node-fx85t\" (UID: 
\"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633692 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-kubelet\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633707 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-var-lib-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633723 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-run-ovn-kubernetes\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633737 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-cni-bin\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633754 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-cni-netd\") 
pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633770 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633775 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633810 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-node-log\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633828 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovn-node-metrics-cert\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633861 4761 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633808 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633841 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633859 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-node-log" (OuterVolumeSpecName: "node-log") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633944 4761 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633963 4761 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633977 4761 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.633990 4761 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634002 4761 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634013 4761 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634024 4761 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634035 4761 
reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634047 4761 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634058 4761 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634069 4761 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634080 4761 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.634540 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.638844 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-kube-api-access-96n87" (OuterVolumeSpecName: "kube-api-access-96n87") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "kube-api-access-96n87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.644607 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.654661 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "463dbf7c-b2d9-4f91-819c-f74a30d5d01b" (UID: "463dbf7c-b2d9-4f91-819c-f74a30d5d01b"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.734315 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-node-log\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.734395 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-node-log\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.734440 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovn-node-metrics-cert\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.734461 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-etc-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.734592 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-etc-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 
10:42:11.735294 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-log-socket\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735328 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-slash\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735348 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-ovn\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735364 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-run-netns\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735369 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-log-socket\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735379 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovnkube-config\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735402 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-systemd\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735419 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735438 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovnkube-script-lib\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735441 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-ovn\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735461 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-systemd-units\") pod 
\"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735471 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-slash\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735509 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-systemd\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735480 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrb9\" (UniqueName: \"kubernetes.io/projected/4a8b6434-3718-44f2-afb5-230ffd2b7857-kube-api-access-ggrb9\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735515 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-run-netns\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735587 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-systemd-units\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735488 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-run-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735615 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-env-overrides\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735642 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-kubelet\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735669 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-var-lib-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735687 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-run-ovn-kubernetes\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 
10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735705 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-cni-bin\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735708 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-kubelet\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735724 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-cni-netd\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735768 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735781 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-var-lib-openvswitch\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 
10:42:11.735781 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-cni-bin\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735854 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-cni-netd\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735853 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735901 4761 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735879 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a8b6434-3718-44f2-afb5-230ffd2b7857-host-run-ovn-kubernetes\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735940 4761 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.735989 4761 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.736014 4761 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.736025 4761 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.736036 4761 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.736045 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96n87\" (UniqueName: \"kubernetes.io/projected/463dbf7c-b2d9-4f91-819c-f74a30d5d01b-kube-api-access-96n87\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.736148 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovnkube-script-lib\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.736148 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovnkube-config\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.736179 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a8b6434-3718-44f2-afb5-230ffd2b7857-env-overrides\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.738131 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a8b6434-3718-44f2-afb5-230ffd2b7857-ovn-node-metrics-cert\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.751007 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggrb9\" (UniqueName: \"kubernetes.io/projected/4a8b6434-3718-44f2-afb5-230ffd2b7857-kube-api-access-ggrb9\") pod \"ovnkube-node-fx85t\" (UID: \"4a8b6434-3718-44f2-afb5-230ffd2b7857\") " pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:11 crc kubenswrapper[4761]: I1201 10:42:11.931221 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.511187 4761 generic.go:334] "Generic (PLEG): container finished" podID="4a8b6434-3718-44f2-afb5-230ffd2b7857" containerID="e2bec158c44979bb791b96ee6c62e7c94afb85f1c71f06203d64492eb1518534" exitCode=0 Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.511247 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerDied","Data":"e2bec158c44979bb791b96ee6c62e7c94afb85f1c71f06203d64492eb1518534"} Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.511839 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"1b1f8751c89e77a161b3309b5bbc4e3bf7b8b15576342ebd58b0ed044e8c372f"} Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.519395 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovn-acl-logging/0.log" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.520011 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pllhm_463dbf7c-b2d9-4f91-819c-f74a30d5d01b/ovn-controller/0.log" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.520488 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" event={"ID":"463dbf7c-b2d9-4f91-819c-f74a30d5d01b","Type":"ContainerDied","Data":"a824dc72377a6db821ea40beed6150d7a255b974a9baeddea434ee4b93b58e9e"} Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.520572 4761 scope.go:117] "RemoveContainer" containerID="66d185ea008facfbf66c4693ed2abbb7d581c51a627a47074fc8cc3a1292b153" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.520746 4761 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pllhm" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.522757 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/2.log" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.553437 4761 scope.go:117] "RemoveContainer" containerID="7db1443a59aa5ad96b47ed4959a4f240212a9fffcfbb8215473b946a8214c600" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.618126 4761 scope.go:117] "RemoveContainer" containerID="7d9391ae64da759a4de660084a05ce3387a5428c178e6829feb2023da2019d09" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.622855 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pllhm"] Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.627092 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pllhm"] Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.635981 4761 scope.go:117] "RemoveContainer" containerID="57d18f5633fc286a7de5a5395033a1e23954b899196a6aed478133819cfbfe0c" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.652193 4761 scope.go:117] "RemoveContainer" containerID="793532eadd7f659a4c5d4379bdac0c9819398a940709aa151a57e5f129a7b0c9" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.665872 4761 scope.go:117] "RemoveContainer" containerID="cf5fc50d45bf1bf0fe058a9a1680eee87e86ae479efc36a8c65d62e1b1768ee6" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.680323 4761 scope.go:117] "RemoveContainer" containerID="2512fced7d9f7385d6b72809303261c052770944bae5e2978b92f615b815b2f5" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.697179 4761 scope.go:117] "RemoveContainer" containerID="e22ad1014ea5ff365859bd787d6974c51788432eb264110285b16ad1c712b6bc" Dec 01 10:42:12 crc kubenswrapper[4761]: I1201 10:42:12.718538 4761 
scope.go:117] "RemoveContainer" containerID="97bbe19b4c05b68a61810d2ad58ac55c5ca52703ec3c3bd39567a614889947fb" Dec 01 10:42:13 crc kubenswrapper[4761]: I1201 10:42:13.134789 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463dbf7c-b2d9-4f91-819c-f74a30d5d01b" path="/var/lib/kubelet/pods/463dbf7c-b2d9-4f91-819c-f74a30d5d01b/volumes" Dec 01 10:42:13 crc kubenswrapper[4761]: I1201 10:42:13.537013 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"f1db5ca93bd9674ae643c21c18463249e52ffdd011817c7dbad356955f3c7aa5"} Dec 01 10:42:13 crc kubenswrapper[4761]: I1201 10:42:13.537342 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"0f4df5e4cad9e26dd4ee9755a5e78e20d78a2beed5fd00ba0f44ee0c42cc8aeb"} Dec 01 10:42:13 crc kubenswrapper[4761]: I1201 10:42:13.537518 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"3557a46f6970bc51d21895467702dfe68b2ddff6be394b5d9650008cd22b3eea"} Dec 01 10:42:13 crc kubenswrapper[4761]: I1201 10:42:13.537877 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"2c52104338bc712e1202f547b496ba43e7602c943dc31961f7f206ea9cc4e1c9"} Dec 01 10:42:13 crc kubenswrapper[4761]: I1201 10:42:13.538099 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"469831becfb32336eab258ac8343811dd4643d6384866122c18ed0a274d27c3e"} Dec 01 10:42:13 crc 
kubenswrapper[4761]: I1201 10:42:13.538259 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"4bd0040456d550f4d277289bc0d80efbae4a4679e6140652a1aa895c9352cbb1"} Dec 01 10:42:15 crc kubenswrapper[4761]: I1201 10:42:15.561667 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"5255720c402eabba79fbc58ad934e3c05738ee610cd721de5248b43eb612a5b8"} Dec 01 10:42:18 crc kubenswrapper[4761]: I1201 10:42:18.585822 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" event={"ID":"4a8b6434-3718-44f2-afb5-230ffd2b7857","Type":"ContainerStarted","Data":"5b18893aa96cac8c67ab5cc48bcbb071c829ba9d08df4a46bbb6bd89ad9eacb5"} Dec 01 10:42:18 crc kubenswrapper[4761]: I1201 10:42:18.586270 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:18 crc kubenswrapper[4761]: I1201 10:42:18.586393 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:18 crc kubenswrapper[4761]: I1201 10:42:18.586449 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:18 crc kubenswrapper[4761]: I1201 10:42:18.618372 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:18 crc kubenswrapper[4761]: I1201 10:42:18.625351 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:18 crc kubenswrapper[4761]: I1201 10:42:18.630210 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" podStartSLOduration=7.630189376 podStartE2EDuration="7.630189376s" podCreationTimestamp="2025-12-01 10:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:42:18.620212407 +0000 UTC m=+677.923971041" watchObservedRunningTime="2025-12-01 10:42:18.630189376 +0000 UTC m=+677.933948010" Dec 01 10:42:25 crc kubenswrapper[4761]: I1201 10:42:25.128890 4761 scope.go:117] "RemoveContainer" containerID="5d5ba0b4c00a761700fbb26c07c77c1fefe4b5b54df3f83e70592beb830196eb" Dec 01 10:42:25 crc kubenswrapper[4761]: E1201 10:42:25.129975 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nz6qt_openshift-multus(7a9149d7-77b0-4df1-8d1a-5a94ef00463a)\"" pod="openshift-multus/multus-nz6qt" podUID="7a9149d7-77b0-4df1-8d1a-5a94ef00463a" Dec 01 10:42:38 crc kubenswrapper[4761]: I1201 10:42:38.128257 4761 scope.go:117] "RemoveContainer" containerID="5d5ba0b4c00a761700fbb26c07c77c1fefe4b5b54df3f83e70592beb830196eb" Dec 01 10:42:38 crc kubenswrapper[4761]: I1201 10:42:38.718629 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nz6qt_7a9149d7-77b0-4df1-8d1a-5a94ef00463a/kube-multus/2.log" Dec 01 10:42:38 crc kubenswrapper[4761]: I1201 10:42:38.719020 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nz6qt" event={"ID":"7a9149d7-77b0-4df1-8d1a-5a94ef00463a","Type":"ContainerStarted","Data":"5983c3689d7ff968d9deb8aa680d599f48aec37715b208434034a1cdaa323996"} Dec 01 10:42:42 crc kubenswrapper[4761]: I1201 10:42:42.006189 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fx85t" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.646922 4761 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk"] Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.649989 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.653540 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.654046 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk"] Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.786920 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.786982 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4r66\" (UniqueName: \"kubernetes.io/projected/9a577166-579f-48b6-92c0-39505fdf48f5-kube-api-access-c4r66\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.787141 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-util\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.888025 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.888062 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4r66\" (UniqueName: \"kubernetes.io/projected/9a577166-579f-48b6-92c0-39505fdf48f5-kube-api-access-c4r66\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.888113 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.888477 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.888543 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.910899 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4r66\" (UniqueName: \"kubernetes.io/projected/9a577166-579f-48b6-92c0-39505fdf48f5-kube-api-access-c4r66\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:45 crc kubenswrapper[4761]: I1201 10:42:45.967366 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:46 crc kubenswrapper[4761]: I1201 10:42:46.186149 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk"] Dec 01 10:42:46 crc kubenswrapper[4761]: W1201 10:42:46.199065 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a577166_579f_48b6_92c0_39505fdf48f5.slice/crio-4c973fbecd1ec9dc72620e71d0e62b5d4253f9cc30a23ae1f8accfbc6d5ed3a8 WatchSource:0}: Error finding container 4c973fbecd1ec9dc72620e71d0e62b5d4253f9cc30a23ae1f8accfbc6d5ed3a8: Status 404 returned error can't find the container with id 4c973fbecd1ec9dc72620e71d0e62b5d4253f9cc30a23ae1f8accfbc6d5ed3a8 Dec 01 10:42:46 crc kubenswrapper[4761]: I1201 10:42:46.762430 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" event={"ID":"9a577166-579f-48b6-92c0-39505fdf48f5","Type":"ContainerStarted","Data":"4c973fbecd1ec9dc72620e71d0e62b5d4253f9cc30a23ae1f8accfbc6d5ed3a8"} Dec 01 10:42:47 crc kubenswrapper[4761]: I1201 10:42:47.771259 4761 generic.go:334] "Generic (PLEG): container finished" podID="9a577166-579f-48b6-92c0-39505fdf48f5" containerID="03e43792f2620b919361a9312ad3d0e0f6b211d009fed281a9b2884e783198eb" exitCode=0 Dec 01 10:42:47 crc kubenswrapper[4761]: I1201 10:42:47.771372 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" event={"ID":"9a577166-579f-48b6-92c0-39505fdf48f5","Type":"ContainerDied","Data":"03e43792f2620b919361a9312ad3d0e0f6b211d009fed281a9b2884e783198eb"} Dec 01 10:42:47 crc kubenswrapper[4761]: I1201 10:42:47.775864 4761 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 01 10:42:49 crc kubenswrapper[4761]: E1201 10:42:49.978696 4761 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a577166_579f_48b6_92c0_39505fdf48f5.slice/crio-e206d95a536031a230c9069d08011c640a5eb1c47adb17ea6c2137ad1869fd90.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:42:50 crc kubenswrapper[4761]: I1201 10:42:50.795704 4761 generic.go:334] "Generic (PLEG): container finished" podID="9a577166-579f-48b6-92c0-39505fdf48f5" containerID="e206d95a536031a230c9069d08011c640a5eb1c47adb17ea6c2137ad1869fd90" exitCode=0 Dec 01 10:42:50 crc kubenswrapper[4761]: I1201 10:42:50.795788 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" event={"ID":"9a577166-579f-48b6-92c0-39505fdf48f5","Type":"ContainerDied","Data":"e206d95a536031a230c9069d08011c640a5eb1c47adb17ea6c2137ad1869fd90"} Dec 01 10:42:51 crc kubenswrapper[4761]: I1201 10:42:51.807309 4761 generic.go:334] "Generic (PLEG): container finished" podID="9a577166-579f-48b6-92c0-39505fdf48f5" containerID="300e40737367739022780229674d6d5322c10d7f14aee100e07c5c57434b0f9f" exitCode=0 Dec 01 10:42:51 crc kubenswrapper[4761]: I1201 10:42:51.807490 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" event={"ID":"9a577166-579f-48b6-92c0-39505fdf48f5","Type":"ContainerDied","Data":"300e40737367739022780229674d6d5322c10d7f14aee100e07c5c57434b0f9f"} Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.124351 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.298489 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4r66\" (UniqueName: \"kubernetes.io/projected/9a577166-579f-48b6-92c0-39505fdf48f5-kube-api-access-c4r66\") pod \"9a577166-579f-48b6-92c0-39505fdf48f5\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.298651 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-bundle\") pod \"9a577166-579f-48b6-92c0-39505fdf48f5\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.298699 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-util\") pod \"9a577166-579f-48b6-92c0-39505fdf48f5\" (UID: \"9a577166-579f-48b6-92c0-39505fdf48f5\") " Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.299774 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-bundle" (OuterVolumeSpecName: "bundle") pod "9a577166-579f-48b6-92c0-39505fdf48f5" (UID: "9a577166-579f-48b6-92c0-39505fdf48f5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.306198 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a577166-579f-48b6-92c0-39505fdf48f5-kube-api-access-c4r66" (OuterVolumeSpecName: "kube-api-access-c4r66") pod "9a577166-579f-48b6-92c0-39505fdf48f5" (UID: "9a577166-579f-48b6-92c0-39505fdf48f5"). InnerVolumeSpecName "kube-api-access-c4r66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.320025 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-util" (OuterVolumeSpecName: "util") pod "9a577166-579f-48b6-92c0-39505fdf48f5" (UID: "9a577166-579f-48b6-92c0-39505fdf48f5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.400458 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4r66\" (UniqueName: \"kubernetes.io/projected/9a577166-579f-48b6-92c0-39505fdf48f5-kube-api-access-c4r66\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.400524 4761 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.400544 4761 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a577166-579f-48b6-92c0-39505fdf48f5-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.822619 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" event={"ID":"9a577166-579f-48b6-92c0-39505fdf48f5","Type":"ContainerDied","Data":"4c973fbecd1ec9dc72620e71d0e62b5d4253f9cc30a23ae1f8accfbc6d5ed3a8"} Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.822661 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c973fbecd1ec9dc72620e71d0e62b5d4253f9cc30a23ae1f8accfbc6d5ed3a8" Dec 01 10:42:53 crc kubenswrapper[4761]: I1201 10:42:53.822755 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.748969 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4"] Dec 01 10:43:04 crc kubenswrapper[4761]: E1201 10:43:04.749961 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a577166-579f-48b6-92c0-39505fdf48f5" containerName="pull" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.749976 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a577166-579f-48b6-92c0-39505fdf48f5" containerName="pull" Dec 01 10:43:04 crc kubenswrapper[4761]: E1201 10:43:04.750011 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a577166-579f-48b6-92c0-39505fdf48f5" containerName="util" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.750020 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a577166-579f-48b6-92c0-39505fdf48f5" containerName="util" Dec 01 10:43:04 crc kubenswrapper[4761]: E1201 10:43:04.750041 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a577166-579f-48b6-92c0-39505fdf48f5" containerName="extract" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.750050 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a577166-579f-48b6-92c0-39505fdf48f5" containerName="extract" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.750291 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a577166-579f-48b6-92c0-39505fdf48f5" containerName="extract" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.750962 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.753114 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.753390 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-sb4xp" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.753482 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.754694 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.754883 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.778213 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4"] Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.841910 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt2mr\" (UniqueName: \"kubernetes.io/projected/e1506c16-7214-4b74-a6d5-935646d2bb83-kube-api-access-gt2mr\") pod \"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.841993 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1506c16-7214-4b74-a6d5-935646d2bb83-webhook-cert\") pod 
\"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.842024 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1506c16-7214-4b74-a6d5-935646d2bb83-apiservice-cert\") pod \"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.943367 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1506c16-7214-4b74-a6d5-935646d2bb83-webhook-cert\") pod \"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.943434 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1506c16-7214-4b74-a6d5-935646d2bb83-apiservice-cert\") pod \"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.943467 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt2mr\" (UniqueName: \"kubernetes.io/projected/e1506c16-7214-4b74-a6d5-935646d2bb83-kube-api-access-gt2mr\") pod \"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc 
kubenswrapper[4761]: I1201 10:43:04.964018 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1506c16-7214-4b74-a6d5-935646d2bb83-webhook-cert\") pod \"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.964018 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1506c16-7214-4b74-a6d5-935646d2bb83-apiservice-cert\") pod \"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:04 crc kubenswrapper[4761]: I1201 10:43:04.970582 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt2mr\" (UniqueName: \"kubernetes.io/projected/e1506c16-7214-4b74-a6d5-935646d2bb83-kube-api-access-gt2mr\") pod \"metallb-operator-controller-manager-66985c5f8b-b6zh4\" (UID: \"e1506c16-7214-4b74-a6d5-935646d2bb83\") " pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.064646 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d"] Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.065262 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.067414 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5frzv" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.070932 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.071365 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.075710 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.133729 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d"] Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.146357 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-apiservice-cert\") pod \"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.146421 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkrmw\" (UniqueName: \"kubernetes.io/projected/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-kube-api-access-bkrmw\") pod \"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 
10:43:05.146456 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-webhook-cert\") pod \"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.271741 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-apiservice-cert\") pod \"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.272060 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkrmw\" (UniqueName: \"kubernetes.io/projected/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-kube-api-access-bkrmw\") pod \"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.272087 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-webhook-cert\") pod \"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.288240 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-apiservice-cert\") pod 
\"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.288252 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-webhook-cert\") pod \"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.294990 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkrmw\" (UniqueName: \"kubernetes.io/projected/cc5f6c3c-71a1-443c-9c3a-67fc2305dd62-kube-api-access-bkrmw\") pod \"metallb-operator-webhook-server-56bbcd747-q8n7d\" (UID: \"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62\") " pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.378589 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.392788 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4"] Dec 01 10:43:05 crc kubenswrapper[4761]: W1201 10:43:05.393597 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1506c16_7214_4b74_a6d5_935646d2bb83.slice/crio-ca03b8a1646347995021a5db8809b9eae34a65dc1888304a2e8123574f69b470 WatchSource:0}: Error finding container ca03b8a1646347995021a5db8809b9eae34a65dc1888304a2e8123574f69b470: Status 404 returned error can't find the container with id ca03b8a1646347995021a5db8809b9eae34a65dc1888304a2e8123574f69b470 Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.760199 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d"] Dec 01 10:43:05 crc kubenswrapper[4761]: W1201 10:43:05.771466 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc5f6c3c_71a1_443c_9c3a_67fc2305dd62.slice/crio-2cd79ae486b78b768e8c2500504ef728a100deab73ec354155ce5769e3c1a5f5 WatchSource:0}: Error finding container 2cd79ae486b78b768e8c2500504ef728a100deab73ec354155ce5769e3c1a5f5: Status 404 returned error can't find the container with id 2cd79ae486b78b768e8c2500504ef728a100deab73ec354155ce5769e3c1a5f5 Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.888692 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" event={"ID":"e1506c16-7214-4b74-a6d5-935646d2bb83","Type":"ContainerStarted","Data":"ca03b8a1646347995021a5db8809b9eae34a65dc1888304a2e8123574f69b470"} Dec 01 10:43:05 crc kubenswrapper[4761]: I1201 10:43:05.889495 4761 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" event={"ID":"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62","Type":"ContainerStarted","Data":"2cd79ae486b78b768e8c2500504ef728a100deab73ec354155ce5769e3c1a5f5"} Dec 01 10:43:08 crc kubenswrapper[4761]: I1201 10:43:08.919659 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" event={"ID":"e1506c16-7214-4b74-a6d5-935646d2bb83","Type":"ContainerStarted","Data":"0991d2c49b78e4a61cdf8115b31f824c5783edf3d9c7bc952be20b30c39d6509"} Dec 01 10:43:08 crc kubenswrapper[4761]: I1201 10:43:08.919916 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:08 crc kubenswrapper[4761]: I1201 10:43:08.943523 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" podStartSLOduration=2.369505377 podStartE2EDuration="4.943500907s" podCreationTimestamp="2025-12-01 10:43:04 +0000 UTC" firstStartedPulling="2025-12-01 10:43:05.400146244 +0000 UTC m=+724.703904868" lastFinishedPulling="2025-12-01 10:43:07.974141774 +0000 UTC m=+727.277900398" observedRunningTime="2025-12-01 10:43:08.941320531 +0000 UTC m=+728.245079175" watchObservedRunningTime="2025-12-01 10:43:08.943500907 +0000 UTC m=+728.247259531" Dec 01 10:43:10 crc kubenswrapper[4761]: I1201 10:43:10.937698 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" event={"ID":"cc5f6c3c-71a1-443c-9c3a-67fc2305dd62","Type":"ContainerStarted","Data":"470de73909ed10604cbbe6df5bee9975da30d35d5ccb1a6f2c1939aa33d53af8"} Dec 01 10:43:10 crc kubenswrapper[4761]: I1201 10:43:10.938047 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 
10:43:10 crc kubenswrapper[4761]: I1201 10:43:10.954405 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" podStartSLOduration=1.01289765 podStartE2EDuration="5.954387459s" podCreationTimestamp="2025-12-01 10:43:05 +0000 UTC" firstStartedPulling="2025-12-01 10:43:05.775204357 +0000 UTC m=+725.078962981" lastFinishedPulling="2025-12-01 10:43:10.716694166 +0000 UTC m=+730.020452790" observedRunningTime="2025-12-01 10:43:10.953059515 +0000 UTC m=+730.256818139" watchObservedRunningTime="2025-12-01 10:43:10.954387459 +0000 UTC m=+730.258146083" Dec 01 10:43:25 crc kubenswrapper[4761]: I1201 10:43:25.387721 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-56bbcd747-q8n7d" Dec 01 10:43:31 crc kubenswrapper[4761]: I1201 10:43:31.527382 4761 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 10:43:45 crc kubenswrapper[4761]: I1201 10:43:45.079007 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-66985c5f8b-b6zh4" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.052658 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-dtww8"] Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.055289 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.057188 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w"] Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.057617 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.057871 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.057917 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-r5qs4" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.059242 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.059433 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.093296 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w"] Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.108896 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-hqff8"] Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.109854 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.111688 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.111689 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.112408 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.120806 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2hhlm" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.125952 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-82n8s"] Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.127188 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.129232 4761 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.134933 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-82n8s"] Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.162833 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-t6f4w\" (UID: \"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.162874 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5js5l\" (UniqueName: \"kubernetes.io/projected/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-kube-api-access-5js5l\") pod \"frr-k8s-webhook-server-7fcb986d4-t6f4w\" (UID: \"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.162900 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-metrics\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.162928 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-frr-sockets\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " 
pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.162948 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-frr-conf\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.162969 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4bdee341-d432-4260-8334-4c47aee1593a-frr-startup\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.162994 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68dg\" (UniqueName: \"kubernetes.io/projected/4bdee341-d432-4260-8334-4c47aee1593a-kube-api-access-f68dg\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.163018 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bdee341-d432-4260-8334-4c47aee1593a-metrics-certs\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.163033 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-reloader\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.264500 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztfjp\" (UniqueName: \"kubernetes.io/projected/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-kube-api-access-ztfjp\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.264744 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-cert\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.264868 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4bdee341-d432-4260-8334-4c47aee1593a-frr-startup\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.264985 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-845pj\" (UniqueName: \"kubernetes.io/projected/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-kube-api-access-845pj\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.265109 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68dg\" (UniqueName: \"kubernetes.io/projected/4bdee341-d432-4260-8334-4c47aee1593a-kube-api-access-f68dg\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.265208 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-metrics-certs\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.265341 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bdee341-d432-4260-8334-4c47aee1593a-metrics-certs\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.265436 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-reloader\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.265537 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-t6f4w\" (UID: \"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.265671 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5js5l\" (UniqueName: \"kubernetes.io/projected/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-kube-api-access-5js5l\") pod \"frr-k8s-webhook-server-7fcb986d4-t6f4w\" (UID: \"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.265767 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" 
(UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-metrics\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.265864 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-metrics-certs\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: E1201 10:43:46.266025 4761 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 01 10:43:46 crc kubenswrapper[4761]: E1201 10:43:46.266084 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-cert podName:d5fccf55-8452-4691-9d4b-d27b6c9e0a2f nodeName:}" failed. No retries permitted until 2025-12-01 10:43:46.766067878 +0000 UTC m=+766.069826502 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-cert") pod "frr-k8s-webhook-server-7fcb986d4-t6f4w" (UID: "d5fccf55-8452-4691-9d4b-d27b6c9e0a2f") : secret "frr-k8s-webhook-server-cert" not found Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.266272 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4bdee341-d432-4260-8334-4c47aee1593a-frr-startup\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.266459 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-metrics\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.266598 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-metallb-excludel2\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.266631 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-frr-sockets\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.266653 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist\") pod \"speaker-hqff8\" 
(UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: E1201 10:43:46.266806 4761 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 01 10:43:46 crc kubenswrapper[4761]: E1201 10:43:46.266862 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdee341-d432-4260-8334-4c47aee1593a-metrics-certs podName:4bdee341-d432-4260-8334-4c47aee1593a nodeName:}" failed. No retries permitted until 2025-12-01 10:43:46.766843478 +0000 UTC m=+766.070602122 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bdee341-d432-4260-8334-4c47aee1593a-metrics-certs") pod "frr-k8s-dtww8" (UID: "4bdee341-d432-4260-8334-4c47aee1593a") : secret "frr-k8s-certs-secret" not found Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.266890 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-reloader\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.267725 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-frr-conf\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.267830 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-frr-sockets\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 
10:43:46.268120 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4bdee341-d432-4260-8334-4c47aee1593a-frr-conf\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.283385 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5js5l\" (UniqueName: \"kubernetes.io/projected/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-kube-api-access-5js5l\") pod \"frr-k8s-webhook-server-7fcb986d4-t6f4w\" (UID: \"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.284059 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68dg\" (UniqueName: \"kubernetes.io/projected/4bdee341-d432-4260-8334-4c47aee1593a-kube-api-access-f68dg\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.369507 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-metrics-certs\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.369840 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-metallb-excludel2\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.369969 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.370185 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztfjp\" (UniqueName: \"kubernetes.io/projected/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-kube-api-access-ztfjp\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.370593 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-cert\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.370729 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-845pj\" (UniqueName: \"kubernetes.io/projected/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-kube-api-access-845pj\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.370821 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-metrics-certs\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: E1201 10:43:46.370132 4761 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 10:43:46 crc kubenswrapper[4761]: E1201 10:43:46.371108 4761 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist podName:d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c nodeName:}" failed. No retries permitted until 2025-12-01 10:43:46.871089959 +0000 UTC m=+766.174848593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist") pod "speaker-hqff8" (UID: "d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c") : secret "metallb-memberlist" not found Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.370750 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-metallb-excludel2\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.374938 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-cert\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.375031 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-metrics-certs\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.375234 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-metrics-certs\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc 
kubenswrapper[4761]: I1201 10:43:46.394981 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-845pj\" (UniqueName: \"kubernetes.io/projected/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-kube-api-access-845pj\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.396778 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztfjp\" (UniqueName: \"kubernetes.io/projected/93bcfa9d-c2bd-4d59-9be1-181d49ab1009-kube-api-access-ztfjp\") pod \"controller-f8648f98b-82n8s\" (UID: \"93bcfa9d-c2bd-4d59-9be1-181d49ab1009\") " pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.439803 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.752449 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-82n8s"] Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.777323 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bdee341-d432-4260-8334-4c47aee1593a-metrics-certs\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.777388 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-t6f4w\" (UID: \"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.782943 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bdee341-d432-4260-8334-4c47aee1593a-metrics-certs\") pod \"frr-k8s-dtww8\" (UID: \"4bdee341-d432-4260-8334-4c47aee1593a\") " pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.783568 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5fccf55-8452-4691-9d4b-d27b6c9e0a2f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-t6f4w\" (UID: \"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.878320 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:46 crc kubenswrapper[4761]: E1201 10:43:46.878474 4761 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 10:43:46 crc kubenswrapper[4761]: E1201 10:43:46.878537 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist podName:d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c nodeName:}" failed. No retries permitted until 2025-12-01 10:43:47.878521727 +0000 UTC m=+767.182280351 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist") pod "speaker-hqff8" (UID: "d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c") : secret "metallb-memberlist" not found Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.974878 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-dtww8" Dec 01 10:43:46 crc kubenswrapper[4761]: I1201 10:43:46.983667 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:47 crc kubenswrapper[4761]: I1201 10:43:47.184430 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w"] Dec 01 10:43:47 crc kubenswrapper[4761]: W1201 10:43:47.191038 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5fccf55_8452_4691_9d4b_d27b6c9e0a2f.slice/crio-1f8b1f674476846a13538ae7c508214775a82826e8c70b807afdb882e249d020 WatchSource:0}: Error finding container 1f8b1f674476846a13538ae7c508214775a82826e8c70b807afdb882e249d020: Status 404 returned error can't find the container with id 1f8b1f674476846a13538ae7c508214775a82826e8c70b807afdb882e249d020 Dec 01 10:43:47 crc kubenswrapper[4761]: I1201 10:43:47.232427 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" event={"ID":"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f","Type":"ContainerStarted","Data":"1f8b1f674476846a13538ae7c508214775a82826e8c70b807afdb882e249d020"} Dec 01 10:43:47 crc kubenswrapper[4761]: I1201 10:43:47.233494 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-82n8s" event={"ID":"93bcfa9d-c2bd-4d59-9be1-181d49ab1009","Type":"ContainerStarted","Data":"7504534fc0a2216c8e5ff3c12f8c4d3881dc0eb5a32baff8bf42126efd6d2dce"} Dec 01 10:43:47 crc kubenswrapper[4761]: I1201 10:43:47.891565 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:47 crc 
kubenswrapper[4761]: I1201 10:43:47.897834 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c-memberlist\") pod \"speaker-hqff8\" (UID: \"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c\") " pod="metallb-system/speaker-hqff8" Dec 01 10:43:47 crc kubenswrapper[4761]: I1201 10:43:47.925192 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-hqff8" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.244051 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-82n8s" event={"ID":"93bcfa9d-c2bd-4d59-9be1-181d49ab1009","Type":"ContainerStarted","Data":"449ac905ec61c7c32c18786c501931d8fdd4cb0a7541871d58c50d445d46254d"} Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.246419 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hqff8" event={"ID":"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c","Type":"ContainerStarted","Data":"4e72685fd4f3ea67e3f88085d46f4cf3d3bfc4bd2a88b2f13a1726d10a88bc25"} Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.246443 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hqff8" event={"ID":"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c","Type":"ContainerStarted","Data":"038359abff2d96eeba8c80b788f180a353ae78cb915b5ef0bece6925fa55298a"} Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.247465 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerStarted","Data":"f450c5864c41a4677cda9ab63f5ab0a73c5a833a8bda965661a6c90fd3c935e1"} Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.344648 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d4r59"] Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.345730 4761 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.353762 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4r59"] Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.397336 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-utilities\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.397395 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4cw\" (UniqueName: \"kubernetes.io/projected/6a22e1fb-567f-4c1f-b825-42c460977caa-kube-api-access-rr4cw\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.397421 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-catalog-content\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.498185 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-utilities\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.498259 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rr4cw\" (UniqueName: \"kubernetes.io/projected/6a22e1fb-567f-4c1f-b825-42c460977caa-kube-api-access-rr4cw\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.498291 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-catalog-content\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.498660 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-utilities\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.498739 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-catalog-content\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.551450 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4cw\" (UniqueName: \"kubernetes.io/projected/6a22e1fb-567f-4c1f-b825-42c460977caa-kube-api-access-rr4cw\") pod \"redhat-marketplace-d4r59\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:48 crc kubenswrapper[4761]: I1201 10:43:48.663091 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:49 crc kubenswrapper[4761]: I1201 10:43:49.309170 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4r59"] Dec 01 10:43:49 crc kubenswrapper[4761]: W1201 10:43:49.335779 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a22e1fb_567f_4c1f_b825_42c460977caa.slice/crio-d42a46c7547dd3869f5f3d460e4734428d638e5ce4605568127cb5db26af53ca WatchSource:0}: Error finding container d42a46c7547dd3869f5f3d460e4734428d638e5ce4605568127cb5db26af53ca: Status 404 returned error can't find the container with id d42a46c7547dd3869f5f3d460e4734428d638e5ce4605568127cb5db26af53ca Dec 01 10:43:50 crc kubenswrapper[4761]: I1201 10:43:50.260299 4761 generic.go:334] "Generic (PLEG): container finished" podID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerID="22d9b0c8c6382f94c4cd7b5d98dfbd41b07330f173361838f3b90477b32a0077" exitCode=0 Dec 01 10:43:50 crc kubenswrapper[4761]: I1201 10:43:50.260595 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4r59" event={"ID":"6a22e1fb-567f-4c1f-b825-42c460977caa","Type":"ContainerDied","Data":"22d9b0c8c6382f94c4cd7b5d98dfbd41b07330f173361838f3b90477b32a0077"} Dec 01 10:43:50 crc kubenswrapper[4761]: I1201 10:43:50.260642 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4r59" event={"ID":"6a22e1fb-567f-4c1f-b825-42c460977caa","Type":"ContainerStarted","Data":"d42a46c7547dd3869f5f3d460e4734428d638e5ce4605568127cb5db26af53ca"} Dec 01 10:43:51 crc kubenswrapper[4761]: I1201 10:43:51.269756 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-82n8s" 
event={"ID":"93bcfa9d-c2bd-4d59-9be1-181d49ab1009","Type":"ContainerStarted","Data":"e92cf1d73d568bce685cbf2a86816ebcf13cb39afb77deab690e22c777b06196"} Dec 01 10:43:51 crc kubenswrapper[4761]: I1201 10:43:51.271685 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:43:51 crc kubenswrapper[4761]: I1201 10:43:51.298864 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-82n8s" podStartSLOduration=1.584904895 podStartE2EDuration="5.298841081s" podCreationTimestamp="2025-12-01 10:43:46 +0000 UTC" firstStartedPulling="2025-12-01 10:43:47.336221924 +0000 UTC m=+766.639980558" lastFinishedPulling="2025-12-01 10:43:51.05015811 +0000 UTC m=+770.353916744" observedRunningTime="2025-12-01 10:43:51.28777489 +0000 UTC m=+770.591533514" watchObservedRunningTime="2025-12-01 10:43:51.298841081 +0000 UTC m=+770.602599705" Dec 01 10:43:52 crc kubenswrapper[4761]: I1201 10:43:52.276374 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hqff8" event={"ID":"d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c","Type":"ContainerStarted","Data":"f3703f80d7b7205096d893296e60c1659968dab2bb935f57e760de2ab1f8a769"} Dec 01 10:43:52 crc kubenswrapper[4761]: I1201 10:43:52.276684 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-hqff8" Dec 01 10:43:52 crc kubenswrapper[4761]: I1201 10:43:52.285447 4761 generic.go:334] "Generic (PLEG): container finished" podID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerID="a18fddf327d04caed7941239e260d578fa2ec2a95cbdda77bf86e1da2220e913" exitCode=0 Dec 01 10:43:52 crc kubenswrapper[4761]: I1201 10:43:52.285560 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4r59" 
event={"ID":"6a22e1fb-567f-4c1f-b825-42c460977caa","Type":"ContainerDied","Data":"a18fddf327d04caed7941239e260d578fa2ec2a95cbdda77bf86e1da2220e913"} Dec 01 10:43:52 crc kubenswrapper[4761]: I1201 10:43:52.296874 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-hqff8" podStartSLOduration=3.44155178 podStartE2EDuration="6.296851079s" podCreationTimestamp="2025-12-01 10:43:46 +0000 UTC" firstStartedPulling="2025-12-01 10:43:48.184291973 +0000 UTC m=+767.488050597" lastFinishedPulling="2025-12-01 10:43:51.039591272 +0000 UTC m=+770.343349896" observedRunningTime="2025-12-01 10:43:52.292003716 +0000 UTC m=+771.595762370" watchObservedRunningTime="2025-12-01 10:43:52.296851079 +0000 UTC m=+771.600609723" Dec 01 10:43:53 crc kubenswrapper[4761]: I1201 10:43:53.954569 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dd845"] Dec 01 10:43:53 crc kubenswrapper[4761]: I1201 10:43:53.958075 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:53 crc kubenswrapper[4761]: I1201 10:43:53.984262 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dd845"] Dec 01 10:43:53 crc kubenswrapper[4761]: I1201 10:43:53.989433 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbqnd\" (UniqueName: \"kubernetes.io/projected/7fe41d03-d5a8-491b-b1c3-6337c71d5740-kube-api-access-fbqnd\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:53 crc kubenswrapper[4761]: I1201 10:43:53.989495 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-utilities\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:53 crc kubenswrapper[4761]: I1201 10:43:53.989523 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-catalog-content\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:54 crc kubenswrapper[4761]: I1201 10:43:54.091505 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbqnd\" (UniqueName: \"kubernetes.io/projected/7fe41d03-d5a8-491b-b1c3-6337c71d5740-kube-api-access-fbqnd\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:54 crc kubenswrapper[4761]: I1201 10:43:54.091791 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-utilities\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:54 crc kubenswrapper[4761]: I1201 10:43:54.091892 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-catalog-content\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:54 crc kubenswrapper[4761]: I1201 10:43:54.092397 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-catalog-content\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:54 crc kubenswrapper[4761]: I1201 10:43:54.092740 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-utilities\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:54 crc kubenswrapper[4761]: I1201 10:43:54.121874 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbqnd\" (UniqueName: \"kubernetes.io/projected/7fe41d03-d5a8-491b-b1c3-6337c71d5740-kube-api-access-fbqnd\") pod \"redhat-operators-dd845\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:54 crc kubenswrapper[4761]: I1201 10:43:54.279060 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:43:56 crc kubenswrapper[4761]: I1201 10:43:56.232033 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dd845"] Dec 01 10:43:56 crc kubenswrapper[4761]: W1201 10:43:56.240730 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fe41d03_d5a8_491b_b1c3_6337c71d5740.slice/crio-dca792b086c5e41c26e47c4ea8b4416390b926d8e63eb1b17059f9f88bd86ec2 WatchSource:0}: Error finding container dca792b086c5e41c26e47c4ea8b4416390b926d8e63eb1b17059f9f88bd86ec2: Status 404 returned error can't find the container with id dca792b086c5e41c26e47c4ea8b4416390b926d8e63eb1b17059f9f88bd86ec2 Dec 01 10:43:56 crc kubenswrapper[4761]: I1201 10:43:56.313461 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" event={"ID":"d5fccf55-8452-4691-9d4b-d27b6c9e0a2f","Type":"ContainerStarted","Data":"ef05af6ddf012f8836145992b1919e63cfe11e72798c8ae9c3fcd0b4eb8e0412"} Dec 01 10:43:56 crc kubenswrapper[4761]: I1201 10:43:56.313691 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:43:56 crc kubenswrapper[4761]: I1201 10:43:56.317774 4761 generic.go:334] "Generic (PLEG): container finished" podID="4bdee341-d432-4260-8334-4c47aee1593a" containerID="a42f6a914a94b9ffabc0cb805a2f1f500a622a26f0192f6dc1669ffdd23a96f3" exitCode=0 Dec 01 10:43:56 crc kubenswrapper[4761]: I1201 10:43:56.317861 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerDied","Data":"a42f6a914a94b9ffabc0cb805a2f1f500a622a26f0192f6dc1669ffdd23a96f3"} Dec 01 10:43:56 crc kubenswrapper[4761]: I1201 10:43:56.319648 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dd845" event={"ID":"7fe41d03-d5a8-491b-b1c3-6337c71d5740","Type":"ContainerStarted","Data":"dca792b086c5e41c26e47c4ea8b4416390b926d8e63eb1b17059f9f88bd86ec2"} Dec 01 10:43:56 crc kubenswrapper[4761]: I1201 10:43:56.334267 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" podStartSLOduration=1.627150256 podStartE2EDuration="10.33424455s" podCreationTimestamp="2025-12-01 10:43:46 +0000 UTC" firstStartedPulling="2025-12-01 10:43:47.196860623 +0000 UTC m=+766.500619247" lastFinishedPulling="2025-12-01 10:43:55.903954917 +0000 UTC m=+775.207713541" observedRunningTime="2025-12-01 10:43:56.334051565 +0000 UTC m=+775.637810189" watchObservedRunningTime="2025-12-01 10:43:56.33424455 +0000 UTC m=+775.638003164" Dec 01 10:43:57 crc kubenswrapper[4761]: I1201 10:43:57.328880 4761 generic.go:334] "Generic (PLEG): container finished" podID="4bdee341-d432-4260-8334-4c47aee1593a" containerID="921f9a5d2f7becff0436850728131a8720c9652192a8f67332fcb3b612c07764" exitCode=0 Dec 01 10:43:57 crc kubenswrapper[4761]: I1201 10:43:57.329022 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerDied","Data":"921f9a5d2f7becff0436850728131a8720c9652192a8f67332fcb3b612c07764"} Dec 01 10:43:57 crc kubenswrapper[4761]: I1201 10:43:57.331013 4761 generic.go:334] "Generic (PLEG): container finished" podID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerID="681d76ea5191f1a406fea5c3f668dca6309bef1bf942a1107a82cad252b67907" exitCode=0 Dec 01 10:43:57 crc kubenswrapper[4761]: I1201 10:43:57.331098 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd845" event={"ID":"7fe41d03-d5a8-491b-b1c3-6337c71d5740","Type":"ContainerDied","Data":"681d76ea5191f1a406fea5c3f668dca6309bef1bf942a1107a82cad252b67907"} Dec 01 10:43:57 crc 
kubenswrapper[4761]: I1201 10:43:57.336831 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4r59" event={"ID":"6a22e1fb-567f-4c1f-b825-42c460977caa","Type":"ContainerStarted","Data":"818203d3e43c54325e56064a091da422fd5a3f87e7d973e7d86c4ccb6c98e88c"} Dec 01 10:43:57 crc kubenswrapper[4761]: I1201 10:43:57.407676 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d4r59" podStartSLOduration=4.161426146 podStartE2EDuration="9.407655068s" podCreationTimestamp="2025-12-01 10:43:48 +0000 UTC" firstStartedPulling="2025-12-01 10:43:50.942645715 +0000 UTC m=+770.246404339" lastFinishedPulling="2025-12-01 10:43:56.188874627 +0000 UTC m=+775.492633261" observedRunningTime="2025-12-01 10:43:57.403628366 +0000 UTC m=+776.707386990" watchObservedRunningTime="2025-12-01 10:43:57.407655068 +0000 UTC m=+776.711413702" Dec 01 10:43:58 crc kubenswrapper[4761]: I1201 10:43:58.345244 4761 generic.go:334] "Generic (PLEG): container finished" podID="4bdee341-d432-4260-8334-4c47aee1593a" containerID="2244328418b9a1aceca852158307fa721e2d4cdf268b18af52de81b32b2e277e" exitCode=0 Dec 01 10:43:58 crc kubenswrapper[4761]: I1201 10:43:58.345810 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerDied","Data":"2244328418b9a1aceca852158307fa721e2d4cdf268b18af52de81b32b2e277e"} Dec 01 10:43:58 crc kubenswrapper[4761]: I1201 10:43:58.664149 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:58 crc kubenswrapper[4761]: I1201 10:43:58.664457 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:58 crc kubenswrapper[4761]: I1201 10:43:58.717663 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:43:59 crc kubenswrapper[4761]: I1201 10:43:59.366453 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerStarted","Data":"ee4e932a509ff3eac8c4750abe9b536935c3b6963851c5015c0573804b50c1ee"} Dec 01 10:43:59 crc kubenswrapper[4761]: I1201 10:43:59.366504 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerStarted","Data":"6583b2475e7953e35446a1b6811388194040acce4699178d630aefe68afa5b6c"} Dec 01 10:43:59 crc kubenswrapper[4761]: I1201 10:43:59.366514 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerStarted","Data":"6423d5507d2904fa7f0a17936cae78d2a8d15ac0a99d196788367470f664af65"} Dec 01 10:43:59 crc kubenswrapper[4761]: I1201 10:43:59.366523 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerStarted","Data":"50abc68f95531bb94bf58f76c5f5dcae057ce76e84b1050f7f829425c5f5c72e"} Dec 01 10:43:59 crc kubenswrapper[4761]: I1201 10:43:59.366530 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerStarted","Data":"0675fadd40df6d5d31940d4b97f3a97e66fb115dcb4a01a85cf73b3290674405"} Dec 01 10:43:59 crc kubenswrapper[4761]: I1201 10:43:59.368768 4761 generic.go:334] "Generic (PLEG): container finished" podID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerID="670e45a5fcf679966383758191d81d1202ea49287998904536382cc521b3ea76" exitCode=0 Dec 01 10:43:59 crc kubenswrapper[4761]: I1201 10:43:59.368802 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dd845" event={"ID":"7fe41d03-d5a8-491b-b1c3-6337c71d5740","Type":"ContainerDied","Data":"670e45a5fcf679966383758191d81d1202ea49287998904536382cc521b3ea76"} Dec 01 10:44:00 crc kubenswrapper[4761]: I1201 10:44:00.379411 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtww8" event={"ID":"4bdee341-d432-4260-8334-4c47aee1593a","Type":"ContainerStarted","Data":"bc91cb8cf4320768a567b34b952891085cdd3298f324584d1ce6243992c9c398"} Dec 01 10:44:00 crc kubenswrapper[4761]: I1201 10:44:00.380672 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-dtww8" Dec 01 10:44:00 crc kubenswrapper[4761]: I1201 10:44:00.383638 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd845" event={"ID":"7fe41d03-d5a8-491b-b1c3-6337c71d5740","Type":"ContainerStarted","Data":"9b82b93981e691f7476551f7a696247723e07f29663671ed807944bed2fac5cf"} Dec 01 10:44:00 crc kubenswrapper[4761]: I1201 10:44:00.416411 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-dtww8" podStartSLOduration=5.810591746 podStartE2EDuration="14.416383164s" podCreationTimestamp="2025-12-01 10:43:46 +0000 UTC" firstStartedPulling="2025-12-01 10:43:47.317163601 +0000 UTC m=+766.620922235" lastFinishedPulling="2025-12-01 10:43:55.922955029 +0000 UTC m=+775.226713653" observedRunningTime="2025-12-01 10:44:00.411687395 +0000 UTC m=+779.715446019" watchObservedRunningTime="2025-12-01 10:44:00.416383164 +0000 UTC m=+779.720141828" Dec 01 10:44:00 crc kubenswrapper[4761]: I1201 10:44:00.439249 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dd845" podStartSLOduration=4.545703375 podStartE2EDuration="7.439230863s" podCreationTimestamp="2025-12-01 10:43:53 +0000 UTC" firstStartedPulling="2025-12-01 10:43:57.3339441 +0000 UTC m=+776.637702764" 
lastFinishedPulling="2025-12-01 10:44:00.227471628 +0000 UTC m=+779.531230252" observedRunningTime="2025-12-01 10:44:00.436215976 +0000 UTC m=+779.739974600" watchObservedRunningTime="2025-12-01 10:44:00.439230863 +0000 UTC m=+779.742989487" Dec 01 10:44:01 crc kubenswrapper[4761]: I1201 10:44:01.975199 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-dtww8" Dec 01 10:44:02 crc kubenswrapper[4761]: I1201 10:44:02.011836 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-dtww8" Dec 01 10:44:03 crc kubenswrapper[4761]: I1201 10:44:03.850496 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:44:03 crc kubenswrapper[4761]: I1201 10:44:03.850615 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:44:04 crc kubenswrapper[4761]: I1201 10:44:04.279507 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:44:04 crc kubenswrapper[4761]: I1201 10:44:04.279684 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:44:05 crc kubenswrapper[4761]: I1201 10:44:05.365633 4761 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dd845" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="registry-server" probeResult="failure" output=< Dec 01 
10:44:05 crc kubenswrapper[4761]: timeout: failed to connect service ":50051" within 1s Dec 01 10:44:05 crc kubenswrapper[4761]: > Dec 01 10:44:06 crc kubenswrapper[4761]: I1201 10:44:06.445253 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-82n8s" Dec 01 10:44:06 crc kubenswrapper[4761]: I1201 10:44:06.989129 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t6f4w" Dec 01 10:44:07 crc kubenswrapper[4761]: I1201 10:44:07.930648 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-hqff8" Dec 01 10:44:08 crc kubenswrapper[4761]: I1201 10:44:08.715786 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:44:08 crc kubenswrapper[4761]: I1201 10:44:08.769062 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4r59"] Dec 01 10:44:09 crc kubenswrapper[4761]: I1201 10:44:09.445387 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d4r59" podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerName="registry-server" containerID="cri-o://818203d3e43c54325e56064a091da422fd5a3f87e7d973e7d86c4ccb6c98e88c" gracePeriod=2 Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.466910 4761 generic.go:334] "Generic (PLEG): container finished" podID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerID="818203d3e43c54325e56064a091da422fd5a3f87e7d973e7d86c4ccb6c98e88c" exitCode=0 Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.467009 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4r59" event={"ID":"6a22e1fb-567f-4c1f-b825-42c460977caa","Type":"ContainerDied","Data":"818203d3e43c54325e56064a091da422fd5a3f87e7d973e7d86c4ccb6c98e88c"} Dec 01 
10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.729355 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.877177 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr4cw\" (UniqueName: \"kubernetes.io/projected/6a22e1fb-567f-4c1f-b825-42c460977caa-kube-api-access-rr4cw\") pod \"6a22e1fb-567f-4c1f-b825-42c460977caa\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.877262 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-catalog-content\") pod \"6a22e1fb-567f-4c1f-b825-42c460977caa\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.877312 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-utilities\") pod \"6a22e1fb-567f-4c1f-b825-42c460977caa\" (UID: \"6a22e1fb-567f-4c1f-b825-42c460977caa\") " Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.878159 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-utilities" (OuterVolumeSpecName: "utilities") pod "6a22e1fb-567f-4c1f-b825-42c460977caa" (UID: "6a22e1fb-567f-4c1f-b825-42c460977caa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.884444 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a22e1fb-567f-4c1f-b825-42c460977caa-kube-api-access-rr4cw" (OuterVolumeSpecName: "kube-api-access-rr4cw") pod "6a22e1fb-567f-4c1f-b825-42c460977caa" (UID: "6a22e1fb-567f-4c1f-b825-42c460977caa"). InnerVolumeSpecName "kube-api-access-rr4cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.894006 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a22e1fb-567f-4c1f-b825-42c460977caa" (UID: "6a22e1fb-567f-4c1f-b825-42c460977caa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.978371 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.978413 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr4cw\" (UniqueName: \"kubernetes.io/projected/6a22e1fb-567f-4c1f-b825-42c460977caa-kube-api-access-rr4cw\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:11 crc kubenswrapper[4761]: I1201 10:44:11.978427 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a22e1fb-567f-4c1f-b825-42c460977caa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:12 crc kubenswrapper[4761]: I1201 10:44:12.484605 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4r59" 
event={"ID":"6a22e1fb-567f-4c1f-b825-42c460977caa","Type":"ContainerDied","Data":"d42a46c7547dd3869f5f3d460e4734428d638e5ce4605568127cb5db26af53ca"} Dec 01 10:44:12 crc kubenswrapper[4761]: I1201 10:44:12.484741 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4r59" Dec 01 10:44:12 crc kubenswrapper[4761]: I1201 10:44:12.484970 4761 scope.go:117] "RemoveContainer" containerID="818203d3e43c54325e56064a091da422fd5a3f87e7d973e7d86c4ccb6c98e88c" Dec 01 10:44:12 crc kubenswrapper[4761]: I1201 10:44:12.510580 4761 scope.go:117] "RemoveContainer" containerID="a18fddf327d04caed7941239e260d578fa2ec2a95cbdda77bf86e1da2220e913" Dec 01 10:44:12 crc kubenswrapper[4761]: I1201 10:44:12.522216 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4r59"] Dec 01 10:44:12 crc kubenswrapper[4761]: I1201 10:44:12.531963 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4r59"] Dec 01 10:44:12 crc kubenswrapper[4761]: I1201 10:44:12.545170 4761 scope.go:117] "RemoveContainer" containerID="22d9b0c8c6382f94c4cd7b5d98dfbd41b07330f173361838f3b90477b32a0077" Dec 01 10:44:13 crc kubenswrapper[4761]: I1201 10:44:13.142204 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" path="/var/lib/kubelet/pods/6a22e1fb-567f-4c1f-b825-42c460977caa/volumes" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.250212 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-qlgxz"] Dec 01 10:44:14 crc kubenswrapper[4761]: E1201 10:44:14.250725 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerName="extract-utilities" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.250758 4761 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerName="extract-utilities" Dec 01 10:44:14 crc kubenswrapper[4761]: E1201 10:44:14.250781 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerName="registry-server" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.250797 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerName="registry-server" Dec 01 10:44:14 crc kubenswrapper[4761]: E1201 10:44:14.250831 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerName="extract-content" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.250848 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerName="extract-content" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.251106 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a22e1fb-567f-4c1f-b825-42c460977caa" containerName="registry-server" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.252114 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-qlgxz" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.255121 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.255787 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.255953 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-lfbnh" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.271983 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-qlgxz"] Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.309729 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swptj\" (UniqueName: \"kubernetes.io/projected/8eaf4828-3f26-4a1d-a343-f12e11d9ec70-kube-api-access-swptj\") pod \"mariadb-operator-index-qlgxz\" (UID: \"8eaf4828-3f26-4a1d-a343-f12e11d9ec70\") " pod="openstack-operators/mariadb-operator-index-qlgxz" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.330982 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.410875 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swptj\" (UniqueName: \"kubernetes.io/projected/8eaf4828-3f26-4a1d-a343-f12e11d9ec70-kube-api-access-swptj\") pod \"mariadb-operator-index-qlgxz\" (UID: \"8eaf4828-3f26-4a1d-a343-f12e11d9ec70\") " pod="openstack-operators/mariadb-operator-index-qlgxz" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.429255 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.432887 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swptj\" (UniqueName: \"kubernetes.io/projected/8eaf4828-3f26-4a1d-a343-f12e11d9ec70-kube-api-access-swptj\") pod \"mariadb-operator-index-qlgxz\" (UID: \"8eaf4828-3f26-4a1d-a343-f12e11d9ec70\") " pod="openstack-operators/mariadb-operator-index-qlgxz" Dec 01 10:44:14 crc kubenswrapper[4761]: I1201 10:44:14.577037 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-qlgxz" Dec 01 10:44:15 crc kubenswrapper[4761]: I1201 10:44:15.070576 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-qlgxz"] Dec 01 10:44:15 crc kubenswrapper[4761]: I1201 10:44:15.521539 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-qlgxz" event={"ID":"8eaf4828-3f26-4a1d-a343-f12e11d9ec70","Type":"ContainerStarted","Data":"7a88ee9864373dea5d564287b93b6bc8caa12cf7eb42b439efec9aea70cff912"} Dec 01 10:44:16 crc kubenswrapper[4761]: I1201 10:44:16.998602 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-dtww8" Dec 01 10:44:18 crc kubenswrapper[4761]: I1201 10:44:18.770387 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-qlgxz"] Dec 01 10:44:18 crc kubenswrapper[4761]: I1201 10:44:18.981086 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dd845"] Dec 01 10:44:18 crc kubenswrapper[4761]: I1201 10:44:18.981507 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dd845" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="registry-server" 
containerID="cri-o://9b82b93981e691f7476551f7a696247723e07f29663671ed807944bed2fac5cf" gracePeriod=2 Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.546969 4761 generic.go:334] "Generic (PLEG): container finished" podID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerID="9b82b93981e691f7476551f7a696247723e07f29663671ed807944bed2fac5cf" exitCode=0 Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.547032 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd845" event={"ID":"7fe41d03-d5a8-491b-b1c3-6337c71d5740","Type":"ContainerDied","Data":"9b82b93981e691f7476551f7a696247723e07f29663671ed807944bed2fac5cf"} Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.581699 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-mgkj6"] Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.582593 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.586329 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-mgkj6"] Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.687281 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmcd4\" (UniqueName: \"kubernetes.io/projected/594f3896-fe41-4a3b-878d-849501100194-kube-api-access-kmcd4\") pod \"mariadb-operator-index-mgkj6\" (UID: \"594f3896-fe41-4a3b-878d-849501100194\") " pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.788944 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmcd4\" (UniqueName: \"kubernetes.io/projected/594f3896-fe41-4a3b-878d-849501100194-kube-api-access-kmcd4\") pod \"mariadb-operator-index-mgkj6\" (UID: 
\"594f3896-fe41-4a3b-878d-849501100194\") " pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.808613 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmcd4\" (UniqueName: \"kubernetes.io/projected/594f3896-fe41-4a3b-878d-849501100194-kube-api-access-kmcd4\") pod \"mariadb-operator-index-mgkj6\" (UID: \"594f3896-fe41-4a3b-878d-849501100194\") " pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:19 crc kubenswrapper[4761]: I1201 10:44:19.907476 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:23 crc kubenswrapper[4761]: I1201 10:44:23.757621 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:44:23 crc kubenswrapper[4761]: I1201 10:44:23.839332 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-catalog-content\") pod \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " Dec 01 10:44:23 crc kubenswrapper[4761]: I1201 10:44:23.839420 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbqnd\" (UniqueName: \"kubernetes.io/projected/7fe41d03-d5a8-491b-b1c3-6337c71d5740-kube-api-access-fbqnd\") pod \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " Dec 01 10:44:23 crc kubenswrapper[4761]: I1201 10:44:23.839490 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-utilities\") pod \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\" (UID: \"7fe41d03-d5a8-491b-b1c3-6337c71d5740\") " Dec 01 10:44:23 crc 
kubenswrapper[4761]: I1201 10:44:23.840204 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-utilities" (OuterVolumeSpecName: "utilities") pod "7fe41d03-d5a8-491b-b1c3-6337c71d5740" (UID: "7fe41d03-d5a8-491b-b1c3-6337c71d5740"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:44:23 crc kubenswrapper[4761]: I1201 10:44:23.846009 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe41d03-d5a8-491b-b1c3-6337c71d5740-kube-api-access-fbqnd" (OuterVolumeSpecName: "kube-api-access-fbqnd") pod "7fe41d03-d5a8-491b-b1c3-6337c71d5740" (UID: "7fe41d03-d5a8-491b-b1c3-6337c71d5740"). InnerVolumeSpecName "kube-api-access-fbqnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:44:23 crc kubenswrapper[4761]: I1201 10:44:23.942333 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbqnd\" (UniqueName: \"kubernetes.io/projected/7fe41d03-d5a8-491b-b1c3-6337c71d5740-kube-api-access-fbqnd\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:23 crc kubenswrapper[4761]: I1201 10:44:23.942375 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.001815 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fe41d03-d5a8-491b-b1c3-6337c71d5740" (UID: "7fe41d03-d5a8-491b-b1c3-6337c71d5740"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.043692 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fe41d03-d5a8-491b-b1c3-6337c71d5740-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.331989 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-mgkj6"] Dec 01 10:44:24 crc kubenswrapper[4761]: W1201 10:44:24.338086 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod594f3896_fe41_4a3b_878d_849501100194.slice/crio-edcb653c1a3e8b97e3b281985c260df8dbdadc87604c9f7e0d203ec7b420071e WatchSource:0}: Error finding container edcb653c1a3e8b97e3b281985c260df8dbdadc87604c9f7e0d203ec7b420071e: Status 404 returned error can't find the container with id edcb653c1a3e8b97e3b281985c260df8dbdadc87604c9f7e0d203ec7b420071e Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.582072 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-mgkj6" event={"ID":"594f3896-fe41-4a3b-878d-849501100194","Type":"ContainerStarted","Data":"edcb653c1a3e8b97e3b281985c260df8dbdadc87604c9f7e0d203ec7b420071e"} Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.583733 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-qlgxz" event={"ID":"8eaf4828-3f26-4a1d-a343-f12e11d9ec70","Type":"ContainerStarted","Data":"14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe"} Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.583821 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-qlgxz" podUID="8eaf4828-3f26-4a1d-a343-f12e11d9ec70" containerName="registry-server" 
containerID="cri-o://14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe" gracePeriod=2 Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.590366 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd845" event={"ID":"7fe41d03-d5a8-491b-b1c3-6337c71d5740","Type":"ContainerDied","Data":"dca792b086c5e41c26e47c4ea8b4416390b926d8e63eb1b17059f9f88bd86ec2"} Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.590461 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dd845" Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.590800 4761 scope.go:117] "RemoveContainer" containerID="9b82b93981e691f7476551f7a696247723e07f29663671ed807944bed2fac5cf" Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.610351 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-qlgxz" podStartSLOduration=1.666963628 podStartE2EDuration="10.610332629s" podCreationTimestamp="2025-12-01 10:44:14 +0000 UTC" firstStartedPulling="2025-12-01 10:44:15.078455226 +0000 UTC m=+794.382213870" lastFinishedPulling="2025-12-01 10:44:24.021824247 +0000 UTC m=+803.325582871" observedRunningTime="2025-12-01 10:44:24.602159522 +0000 UTC m=+803.905918166" watchObservedRunningTime="2025-12-01 10:44:24.610332629 +0000 UTC m=+803.914091253" Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.618808 4761 scope.go:117] "RemoveContainer" containerID="670e45a5fcf679966383758191d81d1202ea49287998904536382cc521b3ea76" Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.638581 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dd845"] Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 10:44:24.645467 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dd845"] Dec 01 10:44:24 crc kubenswrapper[4761]: I1201 
10:44:24.710261 4761 scope.go:117] "RemoveContainer" containerID="681d76ea5191f1a406fea5c3f668dca6309bef1bf942a1107a82cad252b67907" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.076386 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-qlgxz" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.139613 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" path="/var/lib/kubelet/pods/7fe41d03-d5a8-491b-b1c3-6337c71d5740/volumes" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.165817 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swptj\" (UniqueName: \"kubernetes.io/projected/8eaf4828-3f26-4a1d-a343-f12e11d9ec70-kube-api-access-swptj\") pod \"8eaf4828-3f26-4a1d-a343-f12e11d9ec70\" (UID: \"8eaf4828-3f26-4a1d-a343-f12e11d9ec70\") " Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.170898 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eaf4828-3f26-4a1d-a343-f12e11d9ec70-kube-api-access-swptj" (OuterVolumeSpecName: "kube-api-access-swptj") pod "8eaf4828-3f26-4a1d-a343-f12e11d9ec70" (UID: "8eaf4828-3f26-4a1d-a343-f12e11d9ec70"). InnerVolumeSpecName "kube-api-access-swptj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.267939 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swptj\" (UniqueName: \"kubernetes.io/projected/8eaf4828-3f26-4a1d-a343-f12e11d9ec70-kube-api-access-swptj\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.605198 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-mgkj6" event={"ID":"594f3896-fe41-4a3b-878d-849501100194","Type":"ContainerStarted","Data":"bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549"} Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.608950 4761 generic.go:334] "Generic (PLEG): container finished" podID="8eaf4828-3f26-4a1d-a343-f12e11d9ec70" containerID="14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe" exitCode=0 Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.608988 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-qlgxz" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.609083 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-qlgxz" event={"ID":"8eaf4828-3f26-4a1d-a343-f12e11d9ec70","Type":"ContainerDied","Data":"14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe"} Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.609187 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-qlgxz" event={"ID":"8eaf4828-3f26-4a1d-a343-f12e11d9ec70","Type":"ContainerDied","Data":"7a88ee9864373dea5d564287b93b6bc8caa12cf7eb42b439efec9aea70cff912"} Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.609219 4761 scope.go:117] "RemoveContainer" containerID="14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.637146 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-mgkj6" podStartSLOduration=6.066373184 podStartE2EDuration="6.636973342s" podCreationTimestamp="2025-12-01 10:44:19 +0000 UTC" firstStartedPulling="2025-12-01 10:44:24.346607377 +0000 UTC m=+803.650366001" lastFinishedPulling="2025-12-01 10:44:24.917207535 +0000 UTC m=+804.220966159" observedRunningTime="2025-12-01 10:44:25.629186765 +0000 UTC m=+804.932945419" watchObservedRunningTime="2025-12-01 10:44:25.636973342 +0000 UTC m=+804.940732006" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.648725 4761 scope.go:117] "RemoveContainer" containerID="14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe" Dec 01 10:44:25 crc kubenswrapper[4761]: E1201 10:44:25.651798 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe\": container with ID starting 
with 14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe not found: ID does not exist" containerID="14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.651937 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe"} err="failed to get container status \"14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe\": rpc error: code = NotFound desc = could not find container \"14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe\": container with ID starting with 14dc4417180f316d16f5f558d03a721b8a4469a3e4cadc46a3ec356b683694fe not found: ID does not exist" Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.678520 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-qlgxz"] Dec 01 10:44:25 crc kubenswrapper[4761]: I1201 10:44:25.682950 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-qlgxz"] Dec 01 10:44:27 crc kubenswrapper[4761]: I1201 10:44:27.135804 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eaf4828-3f26-4a1d-a343-f12e11d9ec70" path="/var/lib/kubelet/pods/8eaf4828-3f26-4a1d-a343-f12e11d9ec70/volumes" Dec 01 10:44:29 crc kubenswrapper[4761]: I1201 10:44:29.908102 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:29 crc kubenswrapper[4761]: I1201 10:44:29.908159 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:29 crc kubenswrapper[4761]: I1201 10:44:29.951921 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:30 crc kubenswrapper[4761]: I1201 
10:44:30.685518 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.438926 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn"] Dec 01 10:44:33 crc kubenswrapper[4761]: E1201 10:44:33.439871 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="registry-server" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.439908 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="registry-server" Dec 01 10:44:33 crc kubenswrapper[4761]: E1201 10:44:33.439969 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaf4828-3f26-4a1d-a343-f12e11d9ec70" containerName="registry-server" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.439988 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaf4828-3f26-4a1d-a343-f12e11d9ec70" containerName="registry-server" Dec 01 10:44:33 crc kubenswrapper[4761]: E1201 10:44:33.440031 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="extract-utilities" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.440050 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="extract-utilities" Dec 01 10:44:33 crc kubenswrapper[4761]: E1201 10:44:33.440079 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="extract-content" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.440095 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="extract-content" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 
10:44:33.440335 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eaf4828-3f26-4a1d-a343-f12e11d9ec70" containerName="registry-server" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.440392 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe41d03-d5a8-491b-b1c3-6337c71d5740" containerName="registry-server" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.441315 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.448242 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn"] Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.448451 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8w9gk" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.486776 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kntx\" (UniqueName: \"kubernetes.io/projected/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-kube-api-access-4kntx\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.487197 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 
10:44:33.487241 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.588763 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kntx\" (UniqueName: \"kubernetes.io/projected/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-kube-api-access-4kntx\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.588890 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.588952 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.589807 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.589929 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.625146 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kntx\" (UniqueName: \"kubernetes.io/projected/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-kube-api-access-4kntx\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.768139 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.851150 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:44:33 crc kubenswrapper[4761]: I1201 10:44:33.851232 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:44:34 crc kubenswrapper[4761]: I1201 10:44:34.200903 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn"] Dec 01 10:44:34 crc kubenswrapper[4761]: I1201 10:44:34.676880 4761 generic.go:334] "Generic (PLEG): container finished" podID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerID="db435de52d06795a3379839fca9be76cdafb208e1fb066917b90badef32e43b6" exitCode=0 Dec 01 10:44:34 crc kubenswrapper[4761]: I1201 10:44:34.676922 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" event={"ID":"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f","Type":"ContainerDied","Data":"db435de52d06795a3379839fca9be76cdafb208e1fb066917b90badef32e43b6"} Dec 01 10:44:34 crc kubenswrapper[4761]: I1201 10:44:34.676950 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" 
event={"ID":"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f","Type":"ContainerStarted","Data":"9dc72c504ed432802d634e47d4e4547fd3f63553834c70ba944561cd28c6ee5e"} Dec 01 10:44:36 crc kubenswrapper[4761]: I1201 10:44:36.698924 4761 generic.go:334] "Generic (PLEG): container finished" podID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerID="0c3dd15463622322d202883f390010b0d27b1b8df8db8d2e2be033ebbf98e8f9" exitCode=0 Dec 01 10:44:36 crc kubenswrapper[4761]: I1201 10:44:36.699062 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" event={"ID":"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f","Type":"ContainerDied","Data":"0c3dd15463622322d202883f390010b0d27b1b8df8db8d2e2be033ebbf98e8f9"} Dec 01 10:44:37 crc kubenswrapper[4761]: I1201 10:44:37.709782 4761 generic.go:334] "Generic (PLEG): container finished" podID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerID="a44d4cca0b0bf03637c4a7ffe033f74179ed74f2deb924b28c83bc4b34bcff99" exitCode=0 Dec 01 10:44:37 crc kubenswrapper[4761]: I1201 10:44:37.709912 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" event={"ID":"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f","Type":"ContainerDied","Data":"a44d4cca0b0bf03637c4a7ffe033f74179ed74f2deb924b28c83bc4b34bcff99"} Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.002187 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.075085 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-util\") pod \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.075149 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-bundle\") pod \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.075283 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kntx\" (UniqueName: \"kubernetes.io/projected/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-kube-api-access-4kntx\") pod \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\" (UID: \"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f\") " Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.076318 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-bundle" (OuterVolumeSpecName: "bundle") pod "4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" (UID: "4da4d646-d8b2-40dc-8a0e-9f66b3567d3f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.083278 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-kube-api-access-4kntx" (OuterVolumeSpecName: "kube-api-access-4kntx") pod "4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" (UID: "4da4d646-d8b2-40dc-8a0e-9f66b3567d3f"). InnerVolumeSpecName "kube-api-access-4kntx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.093909 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-util" (OuterVolumeSpecName: "util") pod "4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" (UID: "4da4d646-d8b2-40dc-8a0e-9f66b3567d3f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.177492 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kntx\" (UniqueName: \"kubernetes.io/projected/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-kube-api-access-4kntx\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.177533 4761 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.177565 4761 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.729113 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" event={"ID":"4da4d646-d8b2-40dc-8a0e-9f66b3567d3f","Type":"ContainerDied","Data":"9dc72c504ed432802d634e47d4e4547fd3f63553834c70ba944561cd28c6ee5e"} Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.729176 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dc72c504ed432802d634e47d4e4547fd3f63553834c70ba944561cd28c6ee5e" Dec 01 10:44:39 crc kubenswrapper[4761]: I1201 10:44:39.729203 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.801603 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q"] Dec 01 10:44:42 crc kubenswrapper[4761]: E1201 10:44:42.802167 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerName="util" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.802425 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerName="util" Dec 01 10:44:42 crc kubenswrapper[4761]: E1201 10:44:42.802439 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerName="pull" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.802449 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerName="pull" Dec 01 10:44:42 crc kubenswrapper[4761]: E1201 10:44:42.802473 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerName="extract" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.802481 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerName="extract" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.802623 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" containerName="extract" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.803077 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.806113 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.806906 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.807228 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-l4pwb" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.829162 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-apiservice-cert\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.829207 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-webhook-cert\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.829256 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq4tc\" (UniqueName: \"kubernetes.io/projected/635c1195-66ca-4595-8f7d-cb66e37db30f-kube-api-access-tq4tc\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: 
\"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.830674 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q"] Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.930655 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-apiservice-cert\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.930701 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-webhook-cert\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.930753 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq4tc\" (UniqueName: \"kubernetes.io/projected/635c1195-66ca-4595-8f7d-cb66e37db30f-kube-api-access-tq4tc\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.935882 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-apiservice-cert\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: 
\"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.935885 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-webhook-cert\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:42 crc kubenswrapper[4761]: I1201 10:44:42.950612 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq4tc\" (UniqueName: \"kubernetes.io/projected/635c1195-66ca-4595-8f7d-cb66e37db30f-kube-api-access-tq4tc\") pod \"mariadb-operator-controller-manager-67d6f98b9-pxc6q\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:43 crc kubenswrapper[4761]: I1201 10:44:43.116584 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:43 crc kubenswrapper[4761]: I1201 10:44:43.585320 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q"] Dec 01 10:44:43 crc kubenswrapper[4761]: W1201 10:44:43.594668 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635c1195_66ca_4595_8f7d_cb66e37db30f.slice/crio-ce748ec2908b38c3c7af51aa5f424832ae2eed33eee71d294cdaf03dd3520f81 WatchSource:0}: Error finding container ce748ec2908b38c3c7af51aa5f424832ae2eed33eee71d294cdaf03dd3520f81: Status 404 returned error can't find the container with id ce748ec2908b38c3c7af51aa5f424832ae2eed33eee71d294cdaf03dd3520f81 Dec 01 10:44:43 crc kubenswrapper[4761]: I1201 10:44:43.753796 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" event={"ID":"635c1195-66ca-4595-8f7d-cb66e37db30f","Type":"ContainerStarted","Data":"ce748ec2908b38c3c7af51aa5f424832ae2eed33eee71d294cdaf03dd3520f81"} Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.680018 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c5bbw"] Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.682031 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.684941 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5bbw"] Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.707450 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-utilities\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.707532 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-catalog-content\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.707723 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bz7w\" (UniqueName: \"kubernetes.io/projected/20342186-5a8f-4b34-b29c-2b32625822a3-kube-api-access-8bz7w\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.808412 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bz7w\" (UniqueName: \"kubernetes.io/projected/20342186-5a8f-4b34-b29c-2b32625822a3-kube-api-access-8bz7w\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.808479 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-utilities\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.808524 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-catalog-content\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.809058 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-utilities\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.809082 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-catalog-content\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:48 crc kubenswrapper[4761]: I1201 10:44:48.829668 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bz7w\" (UniqueName: \"kubernetes.io/projected/20342186-5a8f-4b34-b29c-2b32625822a3-kube-api-access-8bz7w\") pod \"certified-operators-c5bbw\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:49 crc kubenswrapper[4761]: I1201 10:44:49.008674 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:49 crc kubenswrapper[4761]: I1201 10:44:49.754410 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5bbw"] Dec 01 10:44:49 crc kubenswrapper[4761]: W1201 10:44:49.759048 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20342186_5a8f_4b34_b29c_2b32625822a3.slice/crio-d3c9be7be3cfaff617fb96ec12e96ebdb17e21257147e14ff8b2a75b4d3b1e37 WatchSource:0}: Error finding container d3c9be7be3cfaff617fb96ec12e96ebdb17e21257147e14ff8b2a75b4d3b1e37: Status 404 returned error can't find the container with id d3c9be7be3cfaff617fb96ec12e96ebdb17e21257147e14ff8b2a75b4d3b1e37 Dec 01 10:44:49 crc kubenswrapper[4761]: I1201 10:44:49.790590 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5bbw" event={"ID":"20342186-5a8f-4b34-b29c-2b32625822a3","Type":"ContainerStarted","Data":"d3c9be7be3cfaff617fb96ec12e96ebdb17e21257147e14ff8b2a75b4d3b1e37"} Dec 01 10:44:49 crc kubenswrapper[4761]: I1201 10:44:49.792771 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" event={"ID":"635c1195-66ca-4595-8f7d-cb66e37db30f","Type":"ContainerStarted","Data":"a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20"} Dec 01 10:44:49 crc kubenswrapper[4761]: I1201 10:44:49.793219 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:44:49 crc kubenswrapper[4761]: I1201 10:44:49.817758 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" podStartSLOduration=2.020652259 podStartE2EDuration="7.817740588s" podCreationTimestamp="2025-12-01 10:44:42 
+0000 UTC" firstStartedPulling="2025-12-01 10:44:43.597340263 +0000 UTC m=+822.901098907" lastFinishedPulling="2025-12-01 10:44:49.394428612 +0000 UTC m=+828.698187236" observedRunningTime="2025-12-01 10:44:49.816102732 +0000 UTC m=+829.119861356" watchObservedRunningTime="2025-12-01 10:44:49.817740588 +0000 UTC m=+829.121499202" Dec 01 10:44:50 crc kubenswrapper[4761]: I1201 10:44:50.802775 4761 generic.go:334] "Generic (PLEG): container finished" podID="20342186-5a8f-4b34-b29c-2b32625822a3" containerID="b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d" exitCode=0 Dec 01 10:44:50 crc kubenswrapper[4761]: I1201 10:44:50.802897 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5bbw" event={"ID":"20342186-5a8f-4b34-b29c-2b32625822a3","Type":"ContainerDied","Data":"b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d"} Dec 01 10:44:52 crc kubenswrapper[4761]: I1201 10:44:52.817205 4761 generic.go:334] "Generic (PLEG): container finished" podID="20342186-5a8f-4b34-b29c-2b32625822a3" containerID="6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307" exitCode=0 Dec 01 10:44:52 crc kubenswrapper[4761]: I1201 10:44:52.817282 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5bbw" event={"ID":"20342186-5a8f-4b34-b29c-2b32625822a3","Type":"ContainerDied","Data":"6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307"} Dec 01 10:44:53 crc kubenswrapper[4761]: I1201 10:44:53.835852 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5bbw" event={"ID":"20342186-5a8f-4b34-b29c-2b32625822a3","Type":"ContainerStarted","Data":"a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea"} Dec 01 10:44:53 crc kubenswrapper[4761]: I1201 10:44:53.855276 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c5bbw" 
podStartSLOduration=3.177477234 podStartE2EDuration="5.855253597s" podCreationTimestamp="2025-12-01 10:44:48 +0000 UTC" firstStartedPulling="2025-12-01 10:44:50.804645265 +0000 UTC m=+830.108403899" lastFinishedPulling="2025-12-01 10:44:53.482421628 +0000 UTC m=+832.786180262" observedRunningTime="2025-12-01 10:44:53.85108263 +0000 UTC m=+833.154841274" watchObservedRunningTime="2025-12-01 10:44:53.855253597 +0000 UTC m=+833.159012251" Dec 01 10:44:59 crc kubenswrapper[4761]: I1201 10:44:59.009759 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:59 crc kubenswrapper[4761]: I1201 10:44:59.010111 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:59 crc kubenswrapper[4761]: I1201 10:44:59.082837 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:44:59 crc kubenswrapper[4761]: I1201 10:44:59.934255 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.148849 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr"] Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.149600 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.152390 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.154132 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.172514 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr"] Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.285369 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x79dg\" (UniqueName: \"kubernetes.io/projected/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-kube-api-access-x79dg\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.285442 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-secret-volume\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.285462 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-config-volume\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.387432 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x79dg\" (UniqueName: \"kubernetes.io/projected/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-kube-api-access-x79dg\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.387596 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-secret-volume\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.387632 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-config-volume\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.390173 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-config-volume\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.394447 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-secret-volume\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.406807 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x79dg\" (UniqueName: \"kubernetes.io/projected/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-kube-api-access-x79dg\") pod \"collect-profiles-29409765-gdkxr\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.472081 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.698297 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr"] Dec 01 10:45:00 crc kubenswrapper[4761]: I1201 10:45:00.891761 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" event={"ID":"f068dbfd-da6c-48f4-8a01-a15dc3bdc818","Type":"ContainerStarted","Data":"8000d1f6b4d0559120212ba0c65238868ff6b71d39530afb7ea929176f8c49b3"} Dec 01 10:45:01 crc kubenswrapper[4761]: I1201 10:45:01.450528 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c5bbw"] Dec 01 10:45:01 crc kubenswrapper[4761]: I1201 10:45:01.898731 4761 generic.go:334] "Generic (PLEG): container finished" podID="f068dbfd-da6c-48f4-8a01-a15dc3bdc818" containerID="8457ae7cd3ba8ad17ba2097040a523d9907ef285feda0932e80d4853124fc374" exitCode=0 Dec 01 10:45:01 crc kubenswrapper[4761]: I1201 10:45:01.898820 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" event={"ID":"f068dbfd-da6c-48f4-8a01-a15dc3bdc818","Type":"ContainerDied","Data":"8457ae7cd3ba8ad17ba2097040a523d9907ef285feda0932e80d4853124fc374"} Dec 01 10:45:01 crc kubenswrapper[4761]: I1201 10:45:01.898907 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c5bbw" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" containerName="registry-server" containerID="cri-o://a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea" gracePeriod=2 Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.277340 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.416346 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-utilities\") pod \"20342186-5a8f-4b34-b29c-2b32625822a3\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.416465 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-catalog-content\") pod \"20342186-5a8f-4b34-b29c-2b32625822a3\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.416617 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bz7w\" (UniqueName: \"kubernetes.io/projected/20342186-5a8f-4b34-b29c-2b32625822a3-kube-api-access-8bz7w\") pod \"20342186-5a8f-4b34-b29c-2b32625822a3\" (UID: \"20342186-5a8f-4b34-b29c-2b32625822a3\") " Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.421435 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/20342186-5a8f-4b34-b29c-2b32625822a3-kube-api-access-8bz7w" (OuterVolumeSpecName: "kube-api-access-8bz7w") pod "20342186-5a8f-4b34-b29c-2b32625822a3" (UID: "20342186-5a8f-4b34-b29c-2b32625822a3"). InnerVolumeSpecName "kube-api-access-8bz7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.427611 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-utilities" (OuterVolumeSpecName: "utilities") pod "20342186-5a8f-4b34-b29c-2b32625822a3" (UID: "20342186-5a8f-4b34-b29c-2b32625822a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.474218 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20342186-5a8f-4b34-b29c-2b32625822a3" (UID: "20342186-5a8f-4b34-b29c-2b32625822a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.517824 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bz7w\" (UniqueName: \"kubernetes.io/projected/20342186-5a8f-4b34-b29c-2b32625822a3-kube-api-access-8bz7w\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.517865 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.517877 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20342186-5a8f-4b34-b29c-2b32625822a3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.905291 4761 generic.go:334] "Generic (PLEG): container finished" podID="20342186-5a8f-4b34-b29c-2b32625822a3" containerID="a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea" exitCode=0 Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.905331 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5bbw" event={"ID":"20342186-5a8f-4b34-b29c-2b32625822a3","Type":"ContainerDied","Data":"a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea"} Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.905373 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5bbw" event={"ID":"20342186-5a8f-4b34-b29c-2b32625822a3","Type":"ContainerDied","Data":"d3c9be7be3cfaff617fb96ec12e96ebdb17e21257147e14ff8b2a75b4d3b1e37"} Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.905378 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c5bbw" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.905393 4761 scope.go:117] "RemoveContainer" containerID="a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.932266 4761 scope.go:117] "RemoveContainer" containerID="6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.935692 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c5bbw"] Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.940229 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c5bbw"] Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.949939 4761 scope.go:117] "RemoveContainer" containerID="b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.971508 4761 scope.go:117] "RemoveContainer" containerID="a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea" Dec 01 10:45:02 crc kubenswrapper[4761]: E1201 10:45:02.972143 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea\": container with ID starting with a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea not found: ID does not exist" containerID="a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.972175 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea"} err="failed to get container status \"a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea\": rpc error: code = NotFound desc = could not find 
container \"a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea\": container with ID starting with a2dba4dde0999be5e9a0a726729796da504ece8e9f297f39e39ccfb8f5ac6dea not found: ID does not exist" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.972194 4761 scope.go:117] "RemoveContainer" containerID="6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307" Dec 01 10:45:02 crc kubenswrapper[4761]: E1201 10:45:02.972455 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307\": container with ID starting with 6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307 not found: ID does not exist" containerID="6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.972474 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307"} err="failed to get container status \"6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307\": rpc error: code = NotFound desc = could not find container \"6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307\": container with ID starting with 6ea523c21ce6fa971670a529fb5f40e4af29fca596d0ad5b6bfc8836fbbb0307 not found: ID does not exist" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.972485 4761 scope.go:117] "RemoveContainer" containerID="b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d" Dec 01 10:45:02 crc kubenswrapper[4761]: E1201 10:45:02.972683 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d\": container with ID starting with b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d not found: ID does 
not exist" containerID="b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d" Dec 01 10:45:02 crc kubenswrapper[4761]: I1201 10:45:02.972712 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d"} err="failed to get container status \"b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d\": rpc error: code = NotFound desc = could not find container \"b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d\": container with ID starting with b8a06335fb495016959bd7002d405e9218e8cf207a6e2fd87f70a0466238417d not found: ID does not exist" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.120821 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.142299 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" path="/var/lib/kubelet/pods/20342186-5a8f-4b34-b29c-2b32625822a3/volumes" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.210300 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.327973 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-secret-volume\") pod \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.328065 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-config-volume\") pod \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.328091 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x79dg\" (UniqueName: \"kubernetes.io/projected/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-kube-api-access-x79dg\") pod \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\" (UID: \"f068dbfd-da6c-48f4-8a01-a15dc3bdc818\") " Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.328805 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-config-volume" (OuterVolumeSpecName: "config-volume") pod "f068dbfd-da6c-48f4-8a01-a15dc3bdc818" (UID: "f068dbfd-da6c-48f4-8a01-a15dc3bdc818"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.330953 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f068dbfd-da6c-48f4-8a01-a15dc3bdc818" (UID: "f068dbfd-da6c-48f4-8a01-a15dc3bdc818"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.330961 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-kube-api-access-x79dg" (OuterVolumeSpecName: "kube-api-access-x79dg") pod "f068dbfd-da6c-48f4-8a01-a15dc3bdc818" (UID: "f068dbfd-da6c-48f4-8a01-a15dc3bdc818"). InnerVolumeSpecName "kube-api-access-x79dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.429636 4761 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.429701 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x79dg\" (UniqueName: \"kubernetes.io/projected/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-kube-api-access-x79dg\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.429730 4761 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068dbfd-da6c-48f4-8a01-a15dc3bdc818-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.850080 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.850176 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.850241 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.851052 4761 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3eb417125a9051f5c4c312a6fe5fbfd28525e926ddf81a026e3b1bb704152208"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.851152 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://3eb417125a9051f5c4c312a6fe5fbfd28525e926ddf81a026e3b1bb704152208" gracePeriod=600 Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.914028 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" event={"ID":"f068dbfd-da6c-48f4-8a01-a15dc3bdc818","Type":"ContainerDied","Data":"8000d1f6b4d0559120212ba0c65238868ff6b71d39530afb7ea929176f8c49b3"} Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.914065 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-gdkxr" Dec 01 10:45:03 crc kubenswrapper[4761]: I1201 10:45:03.914085 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8000d1f6b4d0559120212ba0c65238868ff6b71d39530afb7ea929176f8c49b3" Dec 01 10:45:04 crc kubenswrapper[4761]: I1201 10:45:04.928099 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="3eb417125a9051f5c4c312a6fe5fbfd28525e926ddf81a026e3b1bb704152208" exitCode=0 Dec 01 10:45:04 crc kubenswrapper[4761]: I1201 10:45:04.928181 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"3eb417125a9051f5c4c312a6fe5fbfd28525e926ddf81a026e3b1bb704152208"} Dec 01 10:45:04 crc kubenswrapper[4761]: I1201 10:45:04.928612 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"d30d5344481323b43a7d255c5c2b5f71119019ddc6b979360df65b87253e34d5"} Dec 01 10:45:04 crc kubenswrapper[4761]: I1201 10:45:04.928653 4761 scope.go:117] "RemoveContainer" containerID="158963cf7c7332677495f8902b02e1b832dfd26ac99929eec34f87750405cba2" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.661515 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-mm28x"] Dec 01 10:45:08 crc kubenswrapper[4761]: E1201 10:45:08.662180 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f068dbfd-da6c-48f4-8a01-a15dc3bdc818" containerName="collect-profiles" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.662193 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f068dbfd-da6c-48f4-8a01-a15dc3bdc818" containerName="collect-profiles" Dec 
01 10:45:08 crc kubenswrapper[4761]: E1201 10:45:08.662205 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" containerName="extract-utilities" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.662211 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" containerName="extract-utilities" Dec 01 10:45:08 crc kubenswrapper[4761]: E1201 10:45:08.662238 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" containerName="extract-content" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.662244 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" containerName="extract-content" Dec 01 10:45:08 crc kubenswrapper[4761]: E1201 10:45:08.662253 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" containerName="registry-server" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.662260 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" containerName="registry-server" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.662360 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f068dbfd-da6c-48f4-8a01-a15dc3bdc818" containerName="collect-profiles" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.662368 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20342186-5a8f-4b34-b29c-2b32625822a3" containerName="registry-server" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.662799 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-mm28x" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.666628 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-fb9f9" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.677963 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-mm28x"] Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.800101 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7w7x\" (UniqueName: \"kubernetes.io/projected/f0600e7b-c33c-4c3f-b7f4-920eb1195251-kube-api-access-r7w7x\") pod \"infra-operator-index-mm28x\" (UID: \"f0600e7b-c33c-4c3f-b7f4-920eb1195251\") " pod="openstack-operators/infra-operator-index-mm28x" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.900769 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7w7x\" (UniqueName: \"kubernetes.io/projected/f0600e7b-c33c-4c3f-b7f4-920eb1195251-kube-api-access-r7w7x\") pod \"infra-operator-index-mm28x\" (UID: \"f0600e7b-c33c-4c3f-b7f4-920eb1195251\") " pod="openstack-operators/infra-operator-index-mm28x" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.939071 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7w7x\" (UniqueName: \"kubernetes.io/projected/f0600e7b-c33c-4c3f-b7f4-920eb1195251-kube-api-access-r7w7x\") pod \"infra-operator-index-mm28x\" (UID: \"f0600e7b-c33c-4c3f-b7f4-920eb1195251\") " pod="openstack-operators/infra-operator-index-mm28x" Dec 01 10:45:08 crc kubenswrapper[4761]: I1201 10:45:08.986477 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-mm28x" Dec 01 10:45:09 crc kubenswrapper[4761]: I1201 10:45:09.439001 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-mm28x"] Dec 01 10:45:09 crc kubenswrapper[4761]: I1201 10:45:09.961777 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mm28x" event={"ID":"f0600e7b-c33c-4c3f-b7f4-920eb1195251","Type":"ContainerStarted","Data":"dd79a3070f9a68f06e5ad9a14194a9bde964c6e2ec818e064c0b6f211e1361e7"} Dec 01 10:45:10 crc kubenswrapper[4761]: I1201 10:45:10.972095 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mm28x" event={"ID":"f0600e7b-c33c-4c3f-b7f4-920eb1195251","Type":"ContainerStarted","Data":"b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0"} Dec 01 10:45:11 crc kubenswrapper[4761]: I1201 10:45:11.002399 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-mm28x" podStartSLOduration=1.85534548 podStartE2EDuration="3.002374092s" podCreationTimestamp="2025-12-01 10:45:08 +0000 UTC" firstStartedPulling="2025-12-01 10:45:09.4502038 +0000 UTC m=+848.753962424" lastFinishedPulling="2025-12-01 10:45:10.597232412 +0000 UTC m=+849.900991036" observedRunningTime="2025-12-01 10:45:10.992101077 +0000 UTC m=+850.295859741" watchObservedRunningTime="2025-12-01 10:45:11.002374092 +0000 UTC m=+850.306132756" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.052634 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-mm28x"] Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.053136 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-mm28x" podUID="f0600e7b-c33c-4c3f-b7f4-920eb1195251" containerName="registry-server" 
containerID="cri-o://b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0" gracePeriod=2 Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.486300 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-mm28x" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.663961 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-vgg5w"] Dec 01 10:45:13 crc kubenswrapper[4761]: E1201 10:45:13.664843 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0600e7b-c33c-4c3f-b7f4-920eb1195251" containerName="registry-server" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.664871 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0600e7b-c33c-4c3f-b7f4-920eb1195251" containerName="registry-server" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.665167 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0600e7b-c33c-4c3f-b7f4-920eb1195251" containerName="registry-server" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.666086 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.673908 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-vgg5w"] Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.678521 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7w7x\" (UniqueName: \"kubernetes.io/projected/f0600e7b-c33c-4c3f-b7f4-920eb1195251-kube-api-access-r7w7x\") pod \"f0600e7b-c33c-4c3f-b7f4-920eb1195251\" (UID: \"f0600e7b-c33c-4c3f-b7f4-920eb1195251\") " Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.679245 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grb6x\" (UniqueName: \"kubernetes.io/projected/66d6a565-82b4-42d3-b803-9ff143c8a8bc-kube-api-access-grb6x\") pod \"infra-operator-index-vgg5w\" (UID: \"66d6a565-82b4-42d3-b803-9ff143c8a8bc\") " pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.689008 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0600e7b-c33c-4c3f-b7f4-920eb1195251-kube-api-access-r7w7x" (OuterVolumeSpecName: "kube-api-access-r7w7x") pod "f0600e7b-c33c-4c3f-b7f4-920eb1195251" (UID: "f0600e7b-c33c-4c3f-b7f4-920eb1195251"). InnerVolumeSpecName "kube-api-access-r7w7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.780428 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grb6x\" (UniqueName: \"kubernetes.io/projected/66d6a565-82b4-42d3-b803-9ff143c8a8bc-kube-api-access-grb6x\") pod \"infra-operator-index-vgg5w\" (UID: \"66d6a565-82b4-42d3-b803-9ff143c8a8bc\") " pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.780595 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7w7x\" (UniqueName: \"kubernetes.io/projected/f0600e7b-c33c-4c3f-b7f4-920eb1195251-kube-api-access-r7w7x\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.798684 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grb6x\" (UniqueName: \"kubernetes.io/projected/66d6a565-82b4-42d3-b803-9ff143c8a8bc-kube-api-access-grb6x\") pod \"infra-operator-index-vgg5w\" (UID: \"66d6a565-82b4-42d3-b803-9ff143c8a8bc\") " pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.990196 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.997140 4761 generic.go:334] "Generic (PLEG): container finished" podID="f0600e7b-c33c-4c3f-b7f4-920eb1195251" containerID="b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0" exitCode=0 Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.997202 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mm28x" event={"ID":"f0600e7b-c33c-4c3f-b7f4-920eb1195251","Type":"ContainerDied","Data":"b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0"} Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.997248 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mm28x" event={"ID":"f0600e7b-c33c-4c3f-b7f4-920eb1195251","Type":"ContainerDied","Data":"dd79a3070f9a68f06e5ad9a14194a9bde964c6e2ec818e064c0b6f211e1361e7"} Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.997211 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-mm28x" Dec 01 10:45:13 crc kubenswrapper[4761]: I1201 10:45:13.997276 4761 scope.go:117] "RemoveContainer" containerID="b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0" Dec 01 10:45:14 crc kubenswrapper[4761]: I1201 10:45:14.042304 4761 scope.go:117] "RemoveContainer" containerID="b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0" Dec 01 10:45:14 crc kubenswrapper[4761]: E1201 10:45:14.043190 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0\": container with ID starting with b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0 not found: ID does not exist" containerID="b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0" Dec 01 10:45:14 crc kubenswrapper[4761]: I1201 10:45:14.043274 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0"} err="failed to get container status \"b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0\": rpc error: code = NotFound desc = could not find container \"b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0\": container with ID starting with b6c40eac1d661aea64fbb93e23ed9c56b4901c068d203f3c34a90bfc42220ba0 not found: ID does not exist" Dec 01 10:45:14 crc kubenswrapper[4761]: I1201 10:45:14.049172 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-mm28x"] Dec 01 10:45:14 crc kubenswrapper[4761]: I1201 10:45:14.054668 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-mm28x"] Dec 01 10:45:14 crc kubenswrapper[4761]: I1201 10:45:14.269053 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-index-vgg5w"] Dec 01 10:45:15 crc kubenswrapper[4761]: I1201 10:45:15.012749 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-vgg5w" event={"ID":"66d6a565-82b4-42d3-b803-9ff143c8a8bc","Type":"ContainerStarted","Data":"a450c237459da4afcffc39341017f6ac90adfaa578a6d8a8bed88bbc488d8154"} Dec 01 10:45:15 crc kubenswrapper[4761]: I1201 10:45:15.144140 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0600e7b-c33c-4c3f-b7f4-920eb1195251" path="/var/lib/kubelet/pods/f0600e7b-c33c-4c3f-b7f4-920eb1195251/volumes" Dec 01 10:45:16 crc kubenswrapper[4761]: I1201 10:45:16.025069 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-vgg5w" event={"ID":"66d6a565-82b4-42d3-b803-9ff143c8a8bc","Type":"ContainerStarted","Data":"5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7"} Dec 01 10:45:16 crc kubenswrapper[4761]: I1201 10:45:16.070627 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-vgg5w" podStartSLOduration=2.267906134 podStartE2EDuration="3.070592611s" podCreationTimestamp="2025-12-01 10:45:13 +0000 UTC" firstStartedPulling="2025-12-01 10:45:14.298283045 +0000 UTC m=+853.602041669" lastFinishedPulling="2025-12-01 10:45:15.100969492 +0000 UTC m=+854.404728146" observedRunningTime="2025-12-01 10:45:16.0593495 +0000 UTC m=+855.363108164" watchObservedRunningTime="2025-12-01 10:45:16.070592611 +0000 UTC m=+855.374351275" Dec 01 10:45:24 crc kubenswrapper[4761]: I1201 10:45:24.193002 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:24 crc kubenswrapper[4761]: I1201 10:45:24.193584 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:24 crc kubenswrapper[4761]: 
I1201 10:45:24.231119 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:25 crc kubenswrapper[4761]: I1201 10:45:25.252599 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.728213 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr"] Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.730124 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.735050 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8w9gk" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.742701 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr"] Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.834031 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.834135 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: 
\"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.834201 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqbt\" (UniqueName: \"kubernetes.io/projected/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-kube-api-access-kxqbt\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.935159 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqbt\" (UniqueName: \"kubernetes.io/projected/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-kube-api-access-kxqbt\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.935691 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.935966 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " 
pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.936656 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.936890 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:26 crc kubenswrapper[4761]: I1201 10:45:26.954638 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqbt\" (UniqueName: \"kubernetes.io/projected/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-kube-api-access-kxqbt\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:27 crc kubenswrapper[4761]: I1201 10:45:27.051784 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:27 crc kubenswrapper[4761]: I1201 10:45:27.449275 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr"] Dec 01 10:45:27 crc kubenswrapper[4761]: W1201 10:45:27.462711 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73f57fa1_0cb4_4df5_8675_7789b6e83ef7.slice/crio-31067c714c8c06c215d9f35127a9631f5314392ffbdaed692e37ab320e11947a WatchSource:0}: Error finding container 31067c714c8c06c215d9f35127a9631f5314392ffbdaed692e37ab320e11947a: Status 404 returned error can't find the container with id 31067c714c8c06c215d9f35127a9631f5314392ffbdaed692e37ab320e11947a Dec 01 10:45:28 crc kubenswrapper[4761]: I1201 10:45:28.230984 4761 generic.go:334] "Generic (PLEG): container finished" podID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerID="9c09540d8e0bd15d479bf5b04e66d385edb8f004ff3375ea3ecaec4b21a1f9a3" exitCode=0 Dec 01 10:45:28 crc kubenswrapper[4761]: I1201 10:45:28.231025 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" event={"ID":"73f57fa1-0cb4-4df5-8675-7789b6e83ef7","Type":"ContainerDied","Data":"9c09540d8e0bd15d479bf5b04e66d385edb8f004ff3375ea3ecaec4b21a1f9a3"} Dec 01 10:45:28 crc kubenswrapper[4761]: I1201 10:45:28.231243 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" event={"ID":"73f57fa1-0cb4-4df5-8675-7789b6e83ef7","Type":"ContainerStarted","Data":"31067c714c8c06c215d9f35127a9631f5314392ffbdaed692e37ab320e11947a"} Dec 01 10:45:30 crc kubenswrapper[4761]: I1201 10:45:30.251996 4761 generic.go:334] "Generic (PLEG): container finished" 
podID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerID="7fb4b133070a8796a792130396df9c4d5bb21afea337b75723314c53b582ac41" exitCode=0 Dec 01 10:45:30 crc kubenswrapper[4761]: I1201 10:45:30.252038 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" event={"ID":"73f57fa1-0cb4-4df5-8675-7789b6e83ef7","Type":"ContainerDied","Data":"7fb4b133070a8796a792130396df9c4d5bb21afea337b75723314c53b582ac41"} Dec 01 10:45:31 crc kubenswrapper[4761]: I1201 10:45:31.266688 4761 generic.go:334] "Generic (PLEG): container finished" podID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerID="0ea1e4467b349a028e34592014f2f13e8aae19f3b904182cd25d19d762464255" exitCode=0 Dec 01 10:45:31 crc kubenswrapper[4761]: I1201 10:45:31.266755 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" event={"ID":"73f57fa1-0cb4-4df5-8675-7789b6e83ef7","Type":"ContainerDied","Data":"0ea1e4467b349a028e34592014f2f13e8aae19f3b904182cd25d19d762464255"} Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.613655 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.711047 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxqbt\" (UniqueName: \"kubernetes.io/projected/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-kube-api-access-kxqbt\") pod \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.711249 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-bundle\") pod \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.711342 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-util\") pod \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\" (UID: \"73f57fa1-0cb4-4df5-8675-7789b6e83ef7\") " Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.713358 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-bundle" (OuterVolumeSpecName: "bundle") pod "73f57fa1-0cb4-4df5-8675-7789b6e83ef7" (UID: "73f57fa1-0cb4-4df5-8675-7789b6e83ef7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.721749 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-kube-api-access-kxqbt" (OuterVolumeSpecName: "kube-api-access-kxqbt") pod "73f57fa1-0cb4-4df5-8675-7789b6e83ef7" (UID: "73f57fa1-0cb4-4df5-8675-7789b6e83ef7"). InnerVolumeSpecName "kube-api-access-kxqbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.813296 4761 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.813338 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxqbt\" (UniqueName: \"kubernetes.io/projected/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-kube-api-access-kxqbt\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:32 crc kubenswrapper[4761]: I1201 10:45:32.978451 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-util" (OuterVolumeSpecName: "util") pod "73f57fa1-0cb4-4df5-8675-7789b6e83ef7" (UID: "73f57fa1-0cb4-4df5-8675-7789b6e83ef7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:45:33 crc kubenswrapper[4761]: I1201 10:45:33.016331 4761 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73f57fa1-0cb4-4df5-8675-7789b6e83ef7-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:33 crc kubenswrapper[4761]: I1201 10:45:33.284156 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" event={"ID":"73f57fa1-0cb4-4df5-8675-7789b6e83ef7","Type":"ContainerDied","Data":"31067c714c8c06c215d9f35127a9631f5314392ffbdaed692e37ab320e11947a"} Dec 01 10:45:33 crc kubenswrapper[4761]: I1201 10:45:33.284230 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr" Dec 01 10:45:33 crc kubenswrapper[4761]: I1201 10:45:33.284234 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31067c714c8c06c215d9f35127a9631f5314392ffbdaed692e37ab320e11947a" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.706655 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl"] Dec 01 10:45:39 crc kubenswrapper[4761]: E1201 10:45:39.707370 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerName="util" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.707383 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerName="util" Dec 01 10:45:39 crc kubenswrapper[4761]: E1201 10:45:39.707396 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerName="extract" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.707402 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerName="extract" Dec 01 10:45:39 crc kubenswrapper[4761]: E1201 10:45:39.707411 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerName="pull" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.707417 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerName="pull" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.707521 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" containerName="extract" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.708149 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.710216 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4d2zz" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.712758 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.726680 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl"] Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.805864 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbk4d\" (UniqueName: \"kubernetes.io/projected/e1a6426b-c4ef-4874-8f48-a59d830ae08d-kube-api-access-xbk4d\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.806032 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-webhook-cert\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.806146 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-apiservice-cert\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: 
\"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.907759 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-apiservice-cert\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.907892 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbk4d\" (UniqueName: \"kubernetes.io/projected/e1a6426b-c4ef-4874-8f48-a59d830ae08d-kube-api-access-xbk4d\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.907921 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-webhook-cert\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.923669 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-apiservice-cert\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.928339 4761 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-webhook-cert\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:39 crc kubenswrapper[4761]: I1201 10:45:39.960454 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbk4d\" (UniqueName: \"kubernetes.io/projected/e1a6426b-c4ef-4874-8f48-a59d830ae08d-kube-api-access-xbk4d\") pod \"infra-operator-controller-manager-67cf567c5-99jtl\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:40 crc kubenswrapper[4761]: I1201 10:45:40.064532 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:40 crc kubenswrapper[4761]: I1201 10:45:40.352086 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl"] Dec 01 10:45:40 crc kubenswrapper[4761]: W1201 10:45:40.356660 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a6426b_c4ef_4874_8f48_a59d830ae08d.slice/crio-fe92fd9f609911f8d1b987f9c4037b864e1288bf46fcb9a82ab26d5ec33ac184 WatchSource:0}: Error finding container fe92fd9f609911f8d1b987f9c4037b864e1288bf46fcb9a82ab26d5ec33ac184: Status 404 returned error can't find the container with id fe92fd9f609911f8d1b987f9c4037b864e1288bf46fcb9a82ab26d5ec33ac184 Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.202889 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.206519 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.210577 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.211502 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.211838 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-4crrg" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.212053 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.212219 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.221097 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.229208 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.230188 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.240235 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.244954 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.245937 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.270041 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336578 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336621 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-default\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336638 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kolla-config\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336663 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgdj8\" (UniqueName: \"kubernetes.io/projected/f6d62685-3430-4fba-b0ca-34ae3169f562-kube-api-access-zgdj8\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336682 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kube-api-access-6rwvm\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336701 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336715 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336731 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-generated\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336748 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336762 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-operator-scripts\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336871 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336927 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.336970 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-default\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.337015 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.337033 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25q2t\" (UniqueName: \"kubernetes.io/projected/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kube-api-access-25q2t\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.337095 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.337118 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-kolla-config\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.337144 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.361322 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" event={"ID":"e1a6426b-c4ef-4874-8f48-a59d830ae08d","Type":"ContainerStarted","Data":"fe92fd9f609911f8d1b987f9c4037b864e1288bf46fcb9a82ab26d5ec33ac184"} Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438265 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-default\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438313 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438328 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25q2t\" (UniqueName: \"kubernetes.io/projected/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kube-api-access-25q2t\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438361 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438380 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-kolla-config\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438397 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-operator-scripts\") pod 
\"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438447 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438465 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-default\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438478 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kolla-config\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438495 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgdj8\" (UniqueName: \"kubernetes.io/projected/f6d62685-3430-4fba-b0ca-34ae3169f562-kube-api-access-zgdj8\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438512 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kube-api-access-6rwvm\") pod \"openstack-galera-0\" (UID: 
\"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438528 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438543 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438571 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-generated\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438600 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438615 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-operator-scripts\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 
01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438645 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438663 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.438990 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.439233 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.439716 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kolla-config\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.439912 4761 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-generated\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.439978 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.440011 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.439205 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.440525 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.440570 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-default\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.440664 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-kolla-config\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.440817 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-default\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.441260 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.441968 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.442242 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.442947 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-operator-scripts\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.454858 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgdj8\" (UniqueName: \"kubernetes.io/projected/f6d62685-3430-4fba-b0ca-34ae3169f562-kube-api-access-zgdj8\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.455915 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.457139 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.457253 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.457811 4761 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kube-api-access-6rwvm\") pod \"openstack-galera-0\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.466540 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25q2t\" (UniqueName: \"kubernetes.io/projected/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kube-api-access-25q2t\") pod \"openstack-galera-1\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.524929 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.553949 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:45:41 crc kubenswrapper[4761]: I1201 10:45:41.566418 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:45:42 crc kubenswrapper[4761]: I1201 10:45:42.370530 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" event={"ID":"e1a6426b-c4ef-4874-8f48-a59d830ae08d","Type":"ContainerStarted","Data":"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6"} Dec 01 10:45:42 crc kubenswrapper[4761]: I1201 10:45:42.636403 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Dec 01 10:45:42 crc kubenswrapper[4761]: W1201 10:45:42.640462 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ebdf60a_b95f_4443_9bcc_452c3d2da2ec.slice/crio-ebd2c2235ca97bdba671327e9120d44524e621192083fda8f3f10608bc95eb6a WatchSource:0}: Error finding container ebd2c2235ca97bdba671327e9120d44524e621192083fda8f3f10608bc95eb6a: Status 404 returned error can't find the container with id ebd2c2235ca97bdba671327e9120d44524e621192083fda8f3f10608bc95eb6a Dec 01 10:45:42 crc kubenswrapper[4761]: I1201 10:45:42.688606 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Dec 01 10:45:42 crc kubenswrapper[4761]: W1201 10:45:42.692961 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7040d73f_f2e1_4a80_a719_8a2f8ff10f7e.slice/crio-28bc08c5d8d9ebf863a0d40bf98b87e48aeb4862f6850ab63037bf059185dca9 WatchSource:0}: Error finding container 28bc08c5d8d9ebf863a0d40bf98b87e48aeb4862f6850ab63037bf059185dca9: Status 404 returned error can't find the container with id 28bc08c5d8d9ebf863a0d40bf98b87e48aeb4862f6850ab63037bf059185dca9 Dec 01 10:45:42 crc kubenswrapper[4761]: W1201 10:45:42.694185 4761 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d62685_3430_4fba_b0ca_34ae3169f562.slice/crio-bfa39049da02d22abd2bc2bf5bddcfdee0a4e64f0955edd0fe2df67029cef680 WatchSource:0}: Error finding container bfa39049da02d22abd2bc2bf5bddcfdee0a4e64f0955edd0fe2df67029cef680: Status 404 returned error can't find the container with id bfa39049da02d22abd2bc2bf5bddcfdee0a4e64f0955edd0fe2df67029cef680 Dec 01 10:45:42 crc kubenswrapper[4761]: I1201 10:45:42.697784 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Dec 01 10:45:43 crc kubenswrapper[4761]: I1201 10:45:43.388240 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f6d62685-3430-4fba-b0ca-34ae3169f562","Type":"ContainerStarted","Data":"bfa39049da02d22abd2bc2bf5bddcfdee0a4e64f0955edd0fe2df67029cef680"} Dec 01 10:45:43 crc kubenswrapper[4761]: I1201 10:45:43.390518 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e","Type":"ContainerStarted","Data":"28bc08c5d8d9ebf863a0d40bf98b87e48aeb4862f6850ab63037bf059185dca9"} Dec 01 10:45:43 crc kubenswrapper[4761]: I1201 10:45:43.392054 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec","Type":"ContainerStarted","Data":"ebd2c2235ca97bdba671327e9120d44524e621192083fda8f3f10608bc95eb6a"} Dec 01 10:45:48 crc kubenswrapper[4761]: E1201 10:45:48.938223 4761 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = reading blob sha256:9edb696970a5d944aa0a013096ed565e4092e9a751f383a80fdde57bd71155e3: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/9e/9edb696970a5d944aa0a013096ed565e4092e9a751f383a80fdde57bd71155e3?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T104543Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=6374bf00967b9fc8ee5fc0cc25ebf4299bc30190566319e7199e1c720aae2560&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-mariadb&akamai_signature=exp=1764586843~hmac=ada39c373b5e45a162f697a8aa4710d67bc43168e9b12e90eb20e6910064ed1d\": EOF" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce" Dec 01 10:45:48 crc kubenswrapper[4761]: E1201 10:45:48.938843 4761 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rwvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-galera-0_glance-kuttl-tests(7ebdf60a-b95f-4443-9bcc-452c3d2da2ec): ErrImagePull: reading blob sha256:9edb696970a5d944aa0a013096ed565e4092e9a751f383a80fdde57bd71155e3: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/9e/9edb696970a5d944aa0a013096ed565e4092e9a751f383a80fdde57bd71155e3?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T104543Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=6374bf00967b9fc8ee5fc0cc25ebf4299bc30190566319e7199e1c720aae2560&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-mariadb&akamai_signature=exp=1764586843~hmac=ada39c373b5e45a162f697a8aa4710d67bc43168e9b12e90eb20e6910064ed1d\": EOF" logger="UnhandledError" Dec 01 10:45:48 crc kubenswrapper[4761]: E1201 10:45:48.940020 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"reading blob sha256:9edb696970a5d944aa0a013096ed565e4092e9a751f383a80fdde57bd71155e3: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/9e/9edb696970a5d944aa0a013096ed565e4092e9a751f383a80fdde57bd71155e3?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T104543Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=6374bf00967b9fc8ee5fc0cc25ebf4299bc30190566319e7199e1c720aae2560&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-mariadb&akamai_signature=exp=1764586843~hmac=ada39c373b5e45a162f697a8aa4710d67bc43168e9b12e90eb20e6910064ed1d\\\": EOF\"" pod="glance-kuttl-tests/openstack-galera-0" podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" Dec 01 10:45:49 crc kubenswrapper[4761]: E1201 10:45:49.431496 4761 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce\\\"\"" pod="glance-kuttl-tests/openstack-galera-0" podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" Dec 01 10:45:51 crc kubenswrapper[4761]: I1201 10:45:51.454311 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" event={"ID":"e1a6426b-c4ef-4874-8f48-a59d830ae08d","Type":"ContainerStarted","Data":"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df"} Dec 01 10:45:51 crc kubenswrapper[4761]: I1201 10:45:51.454704 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:51 crc kubenswrapper[4761]: I1201 10:45:51.459984 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:45:51 crc kubenswrapper[4761]: I1201 10:45:51.485212 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" podStartSLOduration=1.658347005 podStartE2EDuration="12.485193599s" podCreationTimestamp="2025-12-01 10:45:39 +0000 UTC" firstStartedPulling="2025-12-01 10:45:40.358354756 +0000 UTC m=+879.662113380" lastFinishedPulling="2025-12-01 10:45:51.18520135 +0000 UTC m=+890.488959974" observedRunningTime="2025-12-01 10:45:51.472975862 +0000 UTC m=+890.776734486" watchObservedRunningTime="2025-12-01 10:45:51.485193599 +0000 UTC m=+890.788952223" Dec 01 10:45:52 crc kubenswrapper[4761]: I1201 10:45:52.463180 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" 
event={"ID":"f6d62685-3430-4fba-b0ca-34ae3169f562","Type":"ContainerStarted","Data":"a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0"} Dec 01 10:45:52 crc kubenswrapper[4761]: I1201 10:45:52.465004 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e","Type":"ContainerStarted","Data":"51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9"} Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.490526 4761 generic.go:334] "Generic (PLEG): container finished" podID="f6d62685-3430-4fba-b0ca-34ae3169f562" containerID="a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0" exitCode=0 Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.490639 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f6d62685-3430-4fba-b0ca-34ae3169f562","Type":"ContainerDied","Data":"a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0"} Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.493779 4761 generic.go:334] "Generic (PLEG): container finished" podID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" containerID="51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9" exitCode=0 Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.493805 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e","Type":"ContainerDied","Data":"51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9"} Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.856251 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qms5r"] Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.857180 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.860459 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-xqjxh" Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.864195 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qms5r"] Dec 01 10:45:55 crc kubenswrapper[4761]: I1201 10:45:55.980994 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxdwl\" (UniqueName: \"kubernetes.io/projected/3d9174c7-4f65-40de-941a-4e10bf61eb65-kube-api-access-bxdwl\") pod \"rabbitmq-cluster-operator-index-qms5r\" (UID: \"3d9174c7-4f65-40de-941a-4e10bf61eb65\") " pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:45:56 crc kubenswrapper[4761]: I1201 10:45:56.082888 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxdwl\" (UniqueName: \"kubernetes.io/projected/3d9174c7-4f65-40de-941a-4e10bf61eb65-kube-api-access-bxdwl\") pod \"rabbitmq-cluster-operator-index-qms5r\" (UID: \"3d9174c7-4f65-40de-941a-4e10bf61eb65\") " pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:45:56 crc kubenswrapper[4761]: I1201 10:45:56.101603 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxdwl\" (UniqueName: \"kubernetes.io/projected/3d9174c7-4f65-40de-941a-4e10bf61eb65-kube-api-access-bxdwl\") pod \"rabbitmq-cluster-operator-index-qms5r\" (UID: \"3d9174c7-4f65-40de-941a-4e10bf61eb65\") " pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:45:56 crc kubenswrapper[4761]: I1201 10:45:56.173837 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:45:56 crc kubenswrapper[4761]: I1201 10:45:56.506882 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f6d62685-3430-4fba-b0ca-34ae3169f562","Type":"ContainerStarted","Data":"104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95"} Dec 01 10:45:56 crc kubenswrapper[4761]: I1201 10:45:56.509350 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e","Type":"ContainerStarted","Data":"a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb"} Dec 01 10:45:56 crc kubenswrapper[4761]: I1201 10:45:56.544595 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=7.950558583 podStartE2EDuration="16.544568442s" podCreationTimestamp="2025-12-01 10:45:40 +0000 UTC" firstStartedPulling="2025-12-01 10:45:42.699783774 +0000 UTC m=+882.003542408" lastFinishedPulling="2025-12-01 10:45:51.293793643 +0000 UTC m=+890.597552267" observedRunningTime="2025-12-01 10:45:56.535780087 +0000 UTC m=+895.839538721" watchObservedRunningTime="2025-12-01 10:45:56.544568442 +0000 UTC m=+895.848327126" Dec 01 10:45:56 crc kubenswrapper[4761]: I1201 10:45:56.563278 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=7.98483176 podStartE2EDuration="16.563247472s" podCreationTimestamp="2025-12-01 10:45:40 +0000 UTC" firstStartedPulling="2025-12-01 10:45:42.699022824 +0000 UTC m=+882.002781448" lastFinishedPulling="2025-12-01 10:45:51.277438536 +0000 UTC m=+890.581197160" observedRunningTime="2025-12-01 10:45:56.555806023 +0000 UTC m=+895.859564677" watchObservedRunningTime="2025-12-01 10:45:56.563247472 +0000 UTC m=+895.867006136" Dec 01 10:45:56 crc 
kubenswrapper[4761]: I1201 10:45:56.648921 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qms5r"] Dec 01 10:45:56 crc kubenswrapper[4761]: W1201 10:45:56.661219 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d9174c7_4f65_40de_941a_4e10bf61eb65.slice/crio-d54cf7814de50f24e03dabf8fa5d76c20af11db2880eb6153241716515d4f2cc WatchSource:0}: Error finding container d54cf7814de50f24e03dabf8fa5d76c20af11db2880eb6153241716515d4f2cc: Status 404 returned error can't find the container with id d54cf7814de50f24e03dabf8fa5d76c20af11db2880eb6153241716515d4f2cc Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.269580 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cs67h"] Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.270902 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.291300 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cs67h"] Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.402204 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-catalog-content\") pod \"community-operators-cs67h\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.402316 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfj4g\" (UniqueName: \"kubernetes.io/projected/ad125cc0-4a82-457d-bd99-bf20a288372d-kube-api-access-bfj4g\") pod \"community-operators-cs67h\" (UID: 
\"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.402360 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-utilities\") pod \"community-operators-cs67h\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.503251 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-utilities\") pod \"community-operators-cs67h\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.503356 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-catalog-content\") pod \"community-operators-cs67h\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.503427 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfj4g\" (UniqueName: \"kubernetes.io/projected/ad125cc0-4a82-457d-bd99-bf20a288372d-kube-api-access-bfj4g\") pod \"community-operators-cs67h\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.505082 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-utilities\") pod \"community-operators-cs67h\" (UID: 
\"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.505503 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-catalog-content\") pod \"community-operators-cs67h\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.517918 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" event={"ID":"3d9174c7-4f65-40de-941a-4e10bf61eb65","Type":"ContainerStarted","Data":"d54cf7814de50f24e03dabf8fa5d76c20af11db2880eb6153241716515d4f2cc"} Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.551500 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfj4g\" (UniqueName: \"kubernetes.io/projected/ad125cc0-4a82-457d-bd99-bf20a288372d-kube-api-access-bfj4g\") pod \"community-operators-cs67h\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:57 crc kubenswrapper[4761]: I1201 10:45:57.597518 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:45:58 crc kubenswrapper[4761]: I1201 10:45:58.147301 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cs67h"] Dec 01 10:45:58 crc kubenswrapper[4761]: W1201 10:45:58.178463 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad125cc0_4a82_457d_bd99_bf20a288372d.slice/crio-b97635ba9454134cb4d88e0b99a2caec6f881e9f3ade28dd983a5558d0f66267 WatchSource:0}: Error finding container b97635ba9454134cb4d88e0b99a2caec6f881e9f3ade28dd983a5558d0f66267: Status 404 returned error can't find the container with id b97635ba9454134cb4d88e0b99a2caec6f881e9f3ade28dd983a5558d0f66267 Dec 01 10:45:58 crc kubenswrapper[4761]: I1201 10:45:58.537693 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cs67h" event={"ID":"ad125cc0-4a82-457d-bd99-bf20a288372d","Type":"ContainerStarted","Data":"b97635ba9454134cb4d88e0b99a2caec6f881e9f3ade28dd983a5558d0f66267"} Dec 01 10:46:00 crc kubenswrapper[4761]: I1201 10:46:00.551661 4761 generic.go:334] "Generic (PLEG): container finished" podID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerID="4b89e21038509d9f6c9aa07f5e7c1ac0af6fc3d2efc1a57641fbb258fd747fdc" exitCode=0 Dec 01 10:46:00 crc kubenswrapper[4761]: I1201 10:46:00.551946 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cs67h" event={"ID":"ad125cc0-4a82-457d-bd99-bf20a288372d","Type":"ContainerDied","Data":"4b89e21038509d9f6c9aa07f5e7c1ac0af6fc3d2efc1a57641fbb258fd747fdc"} Dec 01 10:46:01 crc kubenswrapper[4761]: I1201 10:46:01.554618 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:46:01 crc kubenswrapper[4761]: I1201 10:46:01.554881 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:46:01 crc kubenswrapper[4761]: I1201 10:46:01.566715 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:46:01 crc kubenswrapper[4761]: I1201 10:46:01.566791 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:46:03 crc kubenswrapper[4761]: I1201 10:46:03.574869 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec","Type":"ContainerStarted","Data":"4f1441564040cbecd73e6e6f1d1b54ca71dcaedb0bd78ccd80643395dc5d1b70"} Dec 01 10:46:03 crc kubenswrapper[4761]: I1201 10:46:03.577716 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" event={"ID":"3d9174c7-4f65-40de-941a-4e10bf61eb65","Type":"ContainerStarted","Data":"e1393851c143b809f1edcc774d597693f6939e74fd12531a0323617674654839"} Dec 01 10:46:03 crc kubenswrapper[4761]: I1201 10:46:03.635082 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" podStartSLOduration=2.471092391 podStartE2EDuration="8.63505924s" podCreationTimestamp="2025-12-01 10:45:55 +0000 UTC" firstStartedPulling="2025-12-01 10:45:56.667317674 +0000 UTC m=+895.971076308" lastFinishedPulling="2025-12-01 10:46:02.831284533 +0000 UTC m=+902.135043157" observedRunningTime="2025-12-01 10:46:03.628818293 +0000 UTC m=+902.932576917" watchObservedRunningTime="2025-12-01 10:46:03.63505924 +0000 UTC m=+902.938817864" Dec 01 10:46:04 crc kubenswrapper[4761]: I1201 10:46:04.587004 4761 generic.go:334] "Generic (PLEG): container finished" podID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerID="f69657d67215d66ceae2163fd9ed2037605a7ee0c95be88575e662c63b990596" exitCode=0 Dec 01 10:46:04 crc 
kubenswrapper[4761]: I1201 10:46:04.587058 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cs67h" event={"ID":"ad125cc0-4a82-457d-bd99-bf20a288372d","Type":"ContainerDied","Data":"f69657d67215d66ceae2163fd9ed2037605a7ee0c95be88575e662c63b990596"} Dec 01 10:46:05 crc kubenswrapper[4761]: I1201 10:46:05.594292 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cs67h" event={"ID":"ad125cc0-4a82-457d-bd99-bf20a288372d","Type":"ContainerStarted","Data":"cc7ab45718aaf09a3398d6dba5d78e63fe3d82a58c1f0816a3c02a7eecb61e99"} Dec 01 10:46:05 crc kubenswrapper[4761]: I1201 10:46:05.616427 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cs67h" podStartSLOduration=5.971614615 podStartE2EDuration="8.616405755s" podCreationTimestamp="2025-12-01 10:45:57 +0000 UTC" firstStartedPulling="2025-12-01 10:46:02.689288358 +0000 UTC m=+901.993046992" lastFinishedPulling="2025-12-01 10:46:05.334079508 +0000 UTC m=+904.637838132" observedRunningTime="2025-12-01 10:46:05.612429419 +0000 UTC m=+904.916188043" watchObservedRunningTime="2025-12-01 10:46:05.616405755 +0000 UTC m=+904.920164389" Dec 01 10:46:06 crc kubenswrapper[4761]: I1201 10:46:06.174791 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:46:06 crc kubenswrapper[4761]: I1201 10:46:06.175118 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:46:06 crc kubenswrapper[4761]: I1201 10:46:06.207610 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:46:06 crc kubenswrapper[4761]: I1201 10:46:06.603158 4761 generic.go:334] "Generic (PLEG): container finished" 
podID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" containerID="4f1441564040cbecd73e6e6f1d1b54ca71dcaedb0bd78ccd80643395dc5d1b70" exitCode=0 Dec 01 10:46:06 crc kubenswrapper[4761]: I1201 10:46:06.603279 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec","Type":"ContainerDied","Data":"4f1441564040cbecd73e6e6f1d1b54ca71dcaedb0bd78ccd80643395dc5d1b70"} Dec 01 10:46:07 crc kubenswrapper[4761]: I1201 10:46:07.598897 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:46:07 crc kubenswrapper[4761]: I1201 10:46:07.599256 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:46:07 crc kubenswrapper[4761]: I1201 10:46:07.614249 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec","Type":"ContainerStarted","Data":"3fe83aff52a11f78ad3da24bbf5719465d1ea5a39022fe7e63bd7d8a8ea94546"} Dec 01 10:46:07 crc kubenswrapper[4761]: I1201 10:46:07.642254 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=-9223372009.212543 podStartE2EDuration="27.642231997s" podCreationTimestamp="2025-12-01 10:45:40 +0000 UTC" firstStartedPulling="2025-12-01 10:45:42.642940035 +0000 UTC m=+881.946698699" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:46:07.636103414 +0000 UTC m=+906.939862038" watchObservedRunningTime="2025-12-01 10:46:07.642231997 +0000 UTC m=+906.945990641" Dec 01 10:46:07 crc kubenswrapper[4761]: I1201 10:46:07.653171 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:46:11 crc kubenswrapper[4761]: I1201 10:46:11.526043 4761 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:46:11 crc kubenswrapper[4761]: I1201 10:46:11.526436 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.787296 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.789708 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.796399 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.796844 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-5ncpr" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.797089 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.841718 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-kolla-config\") pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.841758 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28rfk\" (UniqueName: \"kubernetes.io/projected/12b286f6-2061-4845-a2ea-68fb621ff4d0-kube-api-access-28rfk\") pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.841800 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-config-data\") pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.943646 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-kolla-config\") pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.943718 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28rfk\" (UniqueName: \"kubernetes.io/projected/12b286f6-2061-4845-a2ea-68fb621ff4d0-kube-api-access-28rfk\") pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.943831 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-config-data\") pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.944506 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-kolla-config\") pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.945312 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-config-data\") 
pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:12 crc kubenswrapper[4761]: I1201 10:46:12.969474 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28rfk\" (UniqueName: \"kubernetes.io/projected/12b286f6-2061-4845-a2ea-68fb621ff4d0-kube-api-access-28rfk\") pod \"memcached-0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:13 crc kubenswrapper[4761]: I1201 10:46:13.111367 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:13 crc kubenswrapper[4761]: I1201 10:46:13.617373 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 01 10:46:13 crc kubenswrapper[4761]: I1201 10:46:13.654011 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"12b286f6-2061-4845-a2ea-68fb621ff4d0","Type":"ContainerStarted","Data":"477c97998d9c32ce2a583e151a7412814bbb8ce7a821eb9de4f1e9a64710c52f"} Dec 01 10:46:13 crc kubenswrapper[4761]: I1201 10:46:13.771002 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:46:13 crc kubenswrapper[4761]: I1201 10:46:13.852485 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:46:14 crc kubenswrapper[4761]: E1201 10:46:14.011306 4761 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.88:52430->38.129.56.88:40281: read tcp 38.129.56.88:52430->38.129.56.88:40281: read: connection reset by peer Dec 01 10:46:16 crc kubenswrapper[4761]: I1201 10:46:16.224791 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:46:17 crc kubenswrapper[4761]: I1201 
10:46:17.638473 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:46:17 crc kubenswrapper[4761]: I1201 10:46:17.675039 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"12b286f6-2061-4845-a2ea-68fb621ff4d0","Type":"ContainerStarted","Data":"1e547856630cfabe6fb63d68f08c7e3b67211111602a7caa2d762aa223206c45"} Dec 01 10:46:17 crc kubenswrapper[4761]: I1201 10:46:17.675195 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:17 crc kubenswrapper[4761]: I1201 10:46:17.696988 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=2.362805837 podStartE2EDuration="5.696963462s" podCreationTimestamp="2025-12-01 10:46:12 +0000 UTC" firstStartedPulling="2025-12-01 10:46:13.622380344 +0000 UTC m=+912.926139008" lastFinishedPulling="2025-12-01 10:46:16.956538009 +0000 UTC m=+916.260296633" observedRunningTime="2025-12-01 10:46:17.691808314 +0000 UTC m=+916.995566948" watchObservedRunningTime="2025-12-01 10:46:17.696963462 +0000 UTC m=+917.000722106" Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.252187 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cs67h"] Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.252942 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cs67h" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerName="registry-server" containerID="cri-o://cc7ab45718aaf09a3398d6dba5d78e63fe3d82a58c1f0816a3c02a7eecb61e99" gracePeriod=2 Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.700684 4761 generic.go:334] "Generic (PLEG): container finished" podID="ad125cc0-4a82-457d-bd99-bf20a288372d" 
containerID="cc7ab45718aaf09a3398d6dba5d78e63fe3d82a58c1f0816a3c02a7eecb61e99" exitCode=0 Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.700750 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cs67h" event={"ID":"ad125cc0-4a82-457d-bd99-bf20a288372d","Type":"ContainerDied","Data":"cc7ab45718aaf09a3398d6dba5d78e63fe3d82a58c1f0816a3c02a7eecb61e99"} Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.701096 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cs67h" event={"ID":"ad125cc0-4a82-457d-bd99-bf20a288372d","Type":"ContainerDied","Data":"b97635ba9454134cb4d88e0b99a2caec6f881e9f3ade28dd983a5558d0f66267"} Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.701112 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b97635ba9454134cb4d88e0b99a2caec6f881e9f3ade28dd983a5558d0f66267" Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.723905 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.776074 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-catalog-content\") pod \"ad125cc0-4a82-457d-bd99-bf20a288372d\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.776141 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-utilities\") pod \"ad125cc0-4a82-457d-bd99-bf20a288372d\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.776227 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfj4g\" (UniqueName: \"kubernetes.io/projected/ad125cc0-4a82-457d-bd99-bf20a288372d-kube-api-access-bfj4g\") pod \"ad125cc0-4a82-457d-bd99-bf20a288372d\" (UID: \"ad125cc0-4a82-457d-bd99-bf20a288372d\") " Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.782411 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-utilities" (OuterVolumeSpecName: "utilities") pod "ad125cc0-4a82-457d-bd99-bf20a288372d" (UID: "ad125cc0-4a82-457d-bd99-bf20a288372d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.786713 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad125cc0-4a82-457d-bd99-bf20a288372d-kube-api-access-bfj4g" (OuterVolumeSpecName: "kube-api-access-bfj4g") pod "ad125cc0-4a82-457d-bd99-bf20a288372d" (UID: "ad125cc0-4a82-457d-bd99-bf20a288372d"). InnerVolumeSpecName "kube-api-access-bfj4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.825785 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad125cc0-4a82-457d-bd99-bf20a288372d" (UID: "ad125cc0-4a82-457d-bd99-bf20a288372d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.877597 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.877685 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad125cc0-4a82-457d-bd99-bf20a288372d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:21 crc kubenswrapper[4761]: I1201 10:46:21.877698 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfj4g\" (UniqueName: \"kubernetes.io/projected/ad125cc0-4a82-457d-bd99-bf20a288372d-kube-api-access-bfj4g\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:22 crc kubenswrapper[4761]: I1201 10:46:22.706783 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cs67h" Dec 01 10:46:22 crc kubenswrapper[4761]: I1201 10:46:22.732338 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cs67h"] Dec 01 10:46:22 crc kubenswrapper[4761]: I1201 10:46:22.736709 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cs67h"] Dec 01 10:46:23 crc kubenswrapper[4761]: I1201 10:46:23.112787 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Dec 01 10:46:23 crc kubenswrapper[4761]: I1201 10:46:23.139893 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" path="/var/lib/kubelet/pods/ad125cc0-4a82-457d-bd99-bf20a288372d/volumes" Dec 01 10:46:25 crc kubenswrapper[4761]: I1201 10:46:25.499913 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:46:25 crc kubenswrapper[4761]: I1201 10:46:25.577280 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.303824 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh"] Dec 01 10:46:26 crc kubenswrapper[4761]: E1201 10:46:26.304102 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerName="extract-content" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.304119 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerName="extract-content" Dec 01 10:46:26 crc kubenswrapper[4761]: E1201 10:46:26.304140 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" 
containerName="extract-utilities" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.304148 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerName="extract-utilities" Dec 01 10:46:26 crc kubenswrapper[4761]: E1201 10:46:26.304165 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerName="registry-server" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.304172 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerName="registry-server" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.304307 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad125cc0-4a82-457d-bd99-bf20a288372d" containerName="registry-server" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.305339 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.307092 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8w9gk" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.321244 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh"] Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.475280 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.475333 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.475421 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrl4\" (UniqueName: \"kubernetes.io/projected/dac62f65-15c5-4ba6-881b-0af0ccf341b6-kube-api-access-pmrl4\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.576829 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.576989 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrl4\" (UniqueName: \"kubernetes.io/projected/dac62f65-15c5-4ba6-881b-0af0ccf341b6-kube-api-access-pmrl4\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.577173 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.577377 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.577871 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.604995 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrl4\" (UniqueName: \"kubernetes.io/projected/dac62f65-15c5-4ba6-881b-0af0ccf341b6-kube-api-access-pmrl4\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.629808 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:26 crc kubenswrapper[4761]: I1201 10:46:26.980769 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh"] Dec 01 10:46:27 crc kubenswrapper[4761]: I1201 10:46:27.740730 4761 generic.go:334] "Generic (PLEG): container finished" podID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerID="dd9570c7685191bb06ff2423acc8abe2c844a69c78ce987216c76f7dbd9e047e" exitCode=0 Dec 01 10:46:27 crc kubenswrapper[4761]: I1201 10:46:27.740791 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" event={"ID":"dac62f65-15c5-4ba6-881b-0af0ccf341b6","Type":"ContainerDied","Data":"dd9570c7685191bb06ff2423acc8abe2c844a69c78ce987216c76f7dbd9e047e"} Dec 01 10:46:27 crc kubenswrapper[4761]: I1201 10:46:27.741049 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" event={"ID":"dac62f65-15c5-4ba6-881b-0af0ccf341b6","Type":"ContainerStarted","Data":"e3e8a605d9bc740848690e76045fdab3aa2f8e19f47826e0fb4d436cb7373393"} Dec 01 10:46:28 crc kubenswrapper[4761]: I1201 10:46:28.228734 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:46:28 crc kubenswrapper[4761]: I1201 10:46:28.296122 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:46:29 crc kubenswrapper[4761]: I1201 10:46:29.759355 4761 generic.go:334] "Generic (PLEG): container finished" podID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerID="cc47b488b9d91db956df0c9f77285bdd2e4fa3ee82efad0b5fcdd75d5c1dfb85" exitCode=0 Dec 01 10:46:29 crc kubenswrapper[4761]: I1201 10:46:29.759448 4761 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" event={"ID":"dac62f65-15c5-4ba6-881b-0af0ccf341b6","Type":"ContainerDied","Data":"cc47b488b9d91db956df0c9f77285bdd2e4fa3ee82efad0b5fcdd75d5c1dfb85"} Dec 01 10:46:30 crc kubenswrapper[4761]: I1201 10:46:30.770403 4761 generic.go:334] "Generic (PLEG): container finished" podID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerID="633a7783582d09f72ccc52631860fc64bdc3acb7073a38836ff5ffe307c570ab" exitCode=0 Dec 01 10:46:30 crc kubenswrapper[4761]: I1201 10:46:30.770692 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" event={"ID":"dac62f65-15c5-4ba6-881b-0af0ccf341b6","Type":"ContainerDied","Data":"633a7783582d09f72ccc52631860fc64bdc3acb7073a38836ff5ffe307c570ab"} Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.154194 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.255745 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-util\") pod \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.255917 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmrl4\" (UniqueName: \"kubernetes.io/projected/dac62f65-15c5-4ba6-881b-0af0ccf341b6-kube-api-access-pmrl4\") pod \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.255998 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-bundle\") pod \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\" (UID: \"dac62f65-15c5-4ba6-881b-0af0ccf341b6\") " Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.256568 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-bundle" (OuterVolumeSpecName: "bundle") pod "dac62f65-15c5-4ba6-881b-0af0ccf341b6" (UID: "dac62f65-15c5-4ba6-881b-0af0ccf341b6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.257800 4761 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.268325 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac62f65-15c5-4ba6-881b-0af0ccf341b6-kube-api-access-pmrl4" (OuterVolumeSpecName: "kube-api-access-pmrl4") pod "dac62f65-15c5-4ba6-881b-0af0ccf341b6" (UID: "dac62f65-15c5-4ba6-881b-0af0ccf341b6"). InnerVolumeSpecName "kube-api-access-pmrl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.277593 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-util" (OuterVolumeSpecName: "util") pod "dac62f65-15c5-4ba6-881b-0af0ccf341b6" (UID: "dac62f65-15c5-4ba6-881b-0af0ccf341b6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.359003 4761 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac62f65-15c5-4ba6-881b-0af0ccf341b6-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.359041 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmrl4\" (UniqueName: \"kubernetes.io/projected/dac62f65-15c5-4ba6-881b-0af0ccf341b6-kube-api-access-pmrl4\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.785063 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" event={"ID":"dac62f65-15c5-4ba6-881b-0af0ccf341b6","Type":"ContainerDied","Data":"e3e8a605d9bc740848690e76045fdab3aa2f8e19f47826e0fb4d436cb7373393"} Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.785107 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh" Dec 01 10:46:32 crc kubenswrapper[4761]: I1201 10:46:32.785116 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3e8a605d9bc740848690e76045fdab3aa2f8e19f47826e0fb4d436cb7373393" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.062205 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq"] Dec 01 10:46:44 crc kubenswrapper[4761]: E1201 10:46:44.063175 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerName="util" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.063189 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerName="util" Dec 01 10:46:44 crc kubenswrapper[4761]: E1201 10:46:44.063205 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerName="pull" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.063213 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerName="pull" Dec 01 10:46:44 crc kubenswrapper[4761]: E1201 10:46:44.063231 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerName="extract" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.063238 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerName="extract" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.063343 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" containerName="extract" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.063851 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.066269 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-n69z9" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.081653 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq"] Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.126571 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6bd7\" (UniqueName: \"kubernetes.io/projected/f9355b38-86ff-42a4-80ea-c34b540953df-kube-api-access-f6bd7\") pod \"rabbitmq-cluster-operator-779fc9694b-plxnq\" (UID: \"f9355b38-86ff-42a4-80ea-c34b540953df\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.227626 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6bd7\" (UniqueName: \"kubernetes.io/projected/f9355b38-86ff-42a4-80ea-c34b540953df-kube-api-access-f6bd7\") pod \"rabbitmq-cluster-operator-779fc9694b-plxnq\" (UID: \"f9355b38-86ff-42a4-80ea-c34b540953df\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.246336 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6bd7\" (UniqueName: \"kubernetes.io/projected/f9355b38-86ff-42a4-80ea-c34b540953df-kube-api-access-f6bd7\") pod \"rabbitmq-cluster-operator-779fc9694b-plxnq\" (UID: \"f9355b38-86ff-42a4-80ea-c34b540953df\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.385715 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.788343 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq"] Dec 01 10:46:44 crc kubenswrapper[4761]: I1201 10:46:44.869898 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" event={"ID":"f9355b38-86ff-42a4-80ea-c34b540953df","Type":"ContainerStarted","Data":"de507c70c37e0cdba746b0533b65e42d77e188123f664adfb5f94a8a6c813335"} Dec 01 10:46:49 crc kubenswrapper[4761]: I1201 10:46:49.913403 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" event={"ID":"f9355b38-86ff-42a4-80ea-c34b540953df","Type":"ContainerStarted","Data":"c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57"} Dec 01 10:46:49 crc kubenswrapper[4761]: I1201 10:46:49.935649 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" podStartSLOduration=1.437692548 podStartE2EDuration="5.935620573s" podCreationTimestamp="2025-12-01 10:46:44 +0000 UTC" firstStartedPulling="2025-12-01 10:46:44.796652313 +0000 UTC m=+944.100410937" lastFinishedPulling="2025-12-01 10:46:49.294580338 +0000 UTC m=+948.598338962" observedRunningTime="2025-12-01 10:46:49.92765289 +0000 UTC m=+949.231411514" watchObservedRunningTime="2025-12-01 10:46:49.935620573 +0000 UTC m=+949.239379227" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.405257 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.406784 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: W1201 10:46:55.410665 4761 reflector.go:561] object-"glance-kuttl-tests"/"rabbitmq-plugins-conf": failed to list *v1.ConfigMap: configmaps "rabbitmq-plugins-conf" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Dec 01 10:46:55 crc kubenswrapper[4761]: W1201 10:46:55.410686 4761 reflector.go:561] object-"glance-kuttl-tests"/"rabbitmq-server-conf": failed to list *v1.ConfigMap: configmaps "rabbitmq-server-conf" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Dec 01 10:46:55 crc kubenswrapper[4761]: W1201 10:46:55.410694 4761 reflector.go:561] object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie": failed to list *v1.Secret: secrets "rabbitmq-erlang-cookie" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Dec 01 10:46:55 crc kubenswrapper[4761]: E1201 10:46:55.410713 4761 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"rabbitmq-plugins-conf\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"rabbitmq-plugins-conf\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 10:46:55 crc kubenswrapper[4761]: E1201 10:46:55.410723 4761 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"rabbitmq-server-conf\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"rabbitmq-server-conf\" is forbidden: User 
\"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 10:46:55 crc kubenswrapper[4761]: W1201 10:46:55.410738 4761 reflector.go:561] object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-hjck5": failed to list *v1.Secret: secrets "rabbitmq-server-dockercfg-hjck5" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Dec 01 10:46:55 crc kubenswrapper[4761]: E1201 10:46:55.410749 4761 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"rabbitmq-erlang-cookie\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"rabbitmq-erlang-cookie\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 10:46:55 crc kubenswrapper[4761]: W1201 10:46:55.410670 4761 reflector.go:561] object-"glance-kuttl-tests"/"rabbitmq-default-user": failed to list *v1.Secret: secrets "rabbitmq-default-user" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Dec 01 10:46:55 crc kubenswrapper[4761]: E1201 10:46:55.410780 4761 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"rabbitmq-default-user\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"rabbitmq-default-user\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 10:46:55 crc kubenswrapper[4761]: E1201 10:46:55.410799 4761 reflector.go:158] 
"Unhandled Error" err="object-\"glance-kuttl-tests\"/\"rabbitmq-server-dockercfg-hjck5\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"rabbitmq-server-dockercfg-hjck5\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.424637 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.471987 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.472021 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.472041 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.472061 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0ea9de0e-511f-47f6-92f5-30756585a438\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.472101 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.472131 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e07e5919-c158-40b5-a20d-6c07c7f98ecd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.472164 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.472185 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjtsx\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-kube-api-access-vjtsx\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.573826 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.573896 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e07e5919-c158-40b5-a20d-6c07c7f98ecd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.573943 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.573967 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjtsx\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-kube-api-access-vjtsx\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.573998 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.574014 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.574030 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.574053 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0ea9de0e-511f-47f6-92f5-30756585a438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.575052 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.575119 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.580628 4761 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.580676 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0ea9de0e-511f-47f6-92f5-30756585a438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/031f9c817c663e770f91f483bacb702c6b0e4a76569fb2c43352bb984561cabf/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.582161 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e07e5919-c158-40b5-a20d-6c07c7f98ecd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.592237 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjtsx\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-kube-api-access-vjtsx\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:55 crc kubenswrapper[4761]: I1201 10:46:55.604265 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0ea9de0e-511f-47f6-92f5-30756585a438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:56 crc kubenswrapper[4761]: I1201 10:46:56.435783 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-hjck5" Dec 01 10:46:56 crc kubenswrapper[4761]: I1201 10:46:56.472109 4761 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Dec 01 10:46:56 crc kubenswrapper[4761]: E1201 10:46:56.574823 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/rabbitmq-erlang-cookie: failed to sync secret cache: timed out waiting for the condition Dec 01 10:46:56 crc kubenswrapper[4761]: E1201 10:46:56.575094 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret podName:e07e5919-c158-40b5-a20d-6c07c7f98ecd nodeName:}" failed. No retries permitted until 2025-12-01 10:46:57.075072964 +0000 UTC m=+956.378831588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "erlang-cookie-secret" (UniqueName: "kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret") pod "rabbitmq-server-0" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd") : failed to sync secret cache: timed out waiting for the condition Dec 01 10:46:56 crc kubenswrapper[4761]: E1201 10:46:56.574898 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/rabbitmq-plugins-conf: failed to sync configmap cache: timed out waiting for the condition Dec 01 10:46:56 crc kubenswrapper[4761]: E1201 10:46:56.575324 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf podName:e07e5919-c158-40b5-a20d-6c07c7f98ecd nodeName:}" failed. No retries permitted until 2025-12-01 10:46:57.07531617 +0000 UTC m=+956.379074784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugins-conf" (UniqueName: "kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf") pod "rabbitmq-server-0" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd") : failed to sync configmap cache: timed out waiting for the condition Dec 01 10:46:56 crc kubenswrapper[4761]: I1201 10:46:56.652777 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Dec 01 10:46:56 crc kubenswrapper[4761]: I1201 10:46:56.664658 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:56 crc kubenswrapper[4761]: I1201 10:46:56.900245 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Dec 01 10:46:56 crc kubenswrapper[4761]: I1201 10:46:56.917050 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.067843 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-txt4b"] Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.069023 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.072228 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-kx2bz" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.083336 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-txt4b"] Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.097645 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.097832 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.099596 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.105241 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 
10:46:57.199190 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ttfv\" (UniqueName: \"kubernetes.io/projected/6c0796c3-509a-4117-8973-0a740ba1dc2f-kube-api-access-9ttfv\") pod \"keystone-operator-index-txt4b\" (UID: \"6c0796c3-509a-4117-8973-0a740ba1dc2f\") " pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.224155 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.301288 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ttfv\" (UniqueName: \"kubernetes.io/projected/6c0796c3-509a-4117-8973-0a740ba1dc2f-kube-api-access-9ttfv\") pod \"keystone-operator-index-txt4b\" (UID: \"6c0796c3-509a-4117-8973-0a740ba1dc2f\") " pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.332386 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ttfv\" (UniqueName: \"kubernetes.io/projected/6c0796c3-509a-4117-8973-0a740ba1dc2f-kube-api-access-9ttfv\") pod \"keystone-operator-index-txt4b\" (UID: \"6c0796c3-509a-4117-8973-0a740ba1dc2f\") " pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.402136 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.723652 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 01 10:46:57 crc kubenswrapper[4761]: W1201 10:46:57.816838 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c0796c3_509a_4117_8973_0a740ba1dc2f.slice/crio-e4a087d67363101254f06fb716372580355d038303ce841b399351f28f6cd648 WatchSource:0}: Error finding container e4a087d67363101254f06fb716372580355d038303ce841b399351f28f6cd648: Status 404 returned error can't find the container with id e4a087d67363101254f06fb716372580355d038303ce841b399351f28f6cd648 Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.818476 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-txt4b"] Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.975125 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e07e5919-c158-40b5-a20d-6c07c7f98ecd","Type":"ContainerStarted","Data":"26b4b258e289dbd521be1c5c5dcfb22a781a54bfed246cf937793889e9b91dcb"} Dec 01 10:46:57 crc kubenswrapper[4761]: I1201 10:46:57.976707 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-txt4b" event={"ID":"6c0796c3-509a-4117-8973-0a740ba1dc2f","Type":"ContainerStarted","Data":"e4a087d67363101254f06fb716372580355d038303ce841b399351f28f6cd648"} Dec 01 10:46:59 crc kubenswrapper[4761]: I1201 10:46:59.992242 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-txt4b" event={"ID":"6c0796c3-509a-4117-8973-0a740ba1dc2f","Type":"ContainerStarted","Data":"93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303"} Dec 01 10:47:00 crc kubenswrapper[4761]: I1201 10:47:00.008685 4761 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-txt4b" podStartSLOduration=1.598471361 podStartE2EDuration="3.008666588s" podCreationTimestamp="2025-12-01 10:46:57 +0000 UTC" firstStartedPulling="2025-12-01 10:46:57.819716895 +0000 UTC m=+957.123475519" lastFinishedPulling="2025-12-01 10:46:59.229912122 +0000 UTC m=+958.533670746" observedRunningTime="2025-12-01 10:47:00.006062438 +0000 UTC m=+959.309821062" watchObservedRunningTime="2025-12-01 10:47:00.008666588 +0000 UTC m=+959.312425212" Dec 01 10:47:07 crc kubenswrapper[4761]: I1201 10:47:07.050513 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e07e5919-c158-40b5-a20d-6c07c7f98ecd","Type":"ContainerStarted","Data":"922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6"} Dec 01 10:47:07 crc kubenswrapper[4761]: I1201 10:47:07.402263 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:47:07 crc kubenswrapper[4761]: I1201 10:47:07.402327 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:47:07 crc kubenswrapper[4761]: I1201 10:47:07.471074 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:47:08 crc kubenswrapper[4761]: I1201 10:47:08.102188 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.306960 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5"] Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.309946 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.313702 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8w9gk" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.322319 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5"] Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.433502 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnk29\" (UniqueName: \"kubernetes.io/projected/33f85f97-cf13-45ad-9b74-50272f00a8be-kube-api-access-nnk29\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.433655 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-bundle\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.433722 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-util\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 
10:47:09.535306 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnk29\" (UniqueName: \"kubernetes.io/projected/33f85f97-cf13-45ad-9b74-50272f00a8be-kube-api-access-nnk29\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.535667 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-bundle\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.535764 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-util\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.536219 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-bundle\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.536459 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-util\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.557010 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnk29\" (UniqueName: \"kubernetes.io/projected/33f85f97-cf13-45ad-9b74-50272f00a8be-kube-api-access-nnk29\") pod \"49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.630602 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:09 crc kubenswrapper[4761]: I1201 10:47:09.912972 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5"] Dec 01 10:47:09 crc kubenswrapper[4761]: W1201 10:47:09.921442 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f85f97_cf13_45ad_9b74_50272f00a8be.slice/crio-f70547fdbf7f57559d2f3f83bc3159826cc2e6de30e0258343a2e9125caaa6e6 WatchSource:0}: Error finding container f70547fdbf7f57559d2f3f83bc3159826cc2e6de30e0258343a2e9125caaa6e6: Status 404 returned error can't find the container with id f70547fdbf7f57559d2f3f83bc3159826cc2e6de30e0258343a2e9125caaa6e6 Dec 01 10:47:10 crc kubenswrapper[4761]: I1201 10:47:10.076234 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" 
event={"ID":"33f85f97-cf13-45ad-9b74-50272f00a8be","Type":"ContainerStarted","Data":"f70547fdbf7f57559d2f3f83bc3159826cc2e6de30e0258343a2e9125caaa6e6"} Dec 01 10:47:11 crc kubenswrapper[4761]: I1201 10:47:11.086720 4761 generic.go:334] "Generic (PLEG): container finished" podID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerID="1107dd2bfb9ba0fe64f35e4825249099b9e1e6e8c05193d90a9876468d91c4b3" exitCode=0 Dec 01 10:47:11 crc kubenswrapper[4761]: I1201 10:47:11.086776 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" event={"ID":"33f85f97-cf13-45ad-9b74-50272f00a8be","Type":"ContainerDied","Data":"1107dd2bfb9ba0fe64f35e4825249099b9e1e6e8c05193d90a9876468d91c4b3"} Dec 01 10:47:12 crc kubenswrapper[4761]: I1201 10:47:12.095542 4761 generic.go:334] "Generic (PLEG): container finished" podID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerID="f5ec29afb6dd6fe8f692c8ed2cfce9766e84d8e73ddda4deceb3b5762919db99" exitCode=0 Dec 01 10:47:12 crc kubenswrapper[4761]: I1201 10:47:12.095633 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" event={"ID":"33f85f97-cf13-45ad-9b74-50272f00a8be","Type":"ContainerDied","Data":"f5ec29afb6dd6fe8f692c8ed2cfce9766e84d8e73ddda4deceb3b5762919db99"} Dec 01 10:47:13 crc kubenswrapper[4761]: I1201 10:47:13.106099 4761 generic.go:334] "Generic (PLEG): container finished" podID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerID="dca4bb5c40a5e876c00219be44d21a85c9667bb8bafa04d62073cd32e7c9a895" exitCode=0 Dec 01 10:47:13 crc kubenswrapper[4761]: I1201 10:47:13.106183 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" event={"ID":"33f85f97-cf13-45ad-9b74-50272f00a8be","Type":"ContainerDied","Data":"dca4bb5c40a5e876c00219be44d21a85c9667bb8bafa04d62073cd32e7c9a895"} Dec 01 
10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.445315 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.612645 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-bundle\") pod \"33f85f97-cf13-45ad-9b74-50272f00a8be\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.612975 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-util\") pod \"33f85f97-cf13-45ad-9b74-50272f00a8be\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.613027 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnk29\" (UniqueName: \"kubernetes.io/projected/33f85f97-cf13-45ad-9b74-50272f00a8be-kube-api-access-nnk29\") pod \"33f85f97-cf13-45ad-9b74-50272f00a8be\" (UID: \"33f85f97-cf13-45ad-9b74-50272f00a8be\") " Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.614332 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-bundle" (OuterVolumeSpecName: "bundle") pod "33f85f97-cf13-45ad-9b74-50272f00a8be" (UID: "33f85f97-cf13-45ad-9b74-50272f00a8be"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.621114 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f85f97-cf13-45ad-9b74-50272f00a8be-kube-api-access-nnk29" (OuterVolumeSpecName: "kube-api-access-nnk29") pod "33f85f97-cf13-45ad-9b74-50272f00a8be" (UID: "33f85f97-cf13-45ad-9b74-50272f00a8be"). InnerVolumeSpecName "kube-api-access-nnk29". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.642819 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-util" (OuterVolumeSpecName: "util") pod "33f85f97-cf13-45ad-9b74-50272f00a8be" (UID: "33f85f97-cf13-45ad-9b74-50272f00a8be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.715113 4761 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.715153 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnk29\" (UniqueName: \"kubernetes.io/projected/33f85f97-cf13-45ad-9b74-50272f00a8be-kube-api-access-nnk29\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:14 crc kubenswrapper[4761]: I1201 10:47:14.715165 4761 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33f85f97-cf13-45ad-9b74-50272f00a8be-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:15 crc kubenswrapper[4761]: I1201 10:47:15.121726 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" 
event={"ID":"33f85f97-cf13-45ad-9b74-50272f00a8be","Type":"ContainerDied","Data":"f70547fdbf7f57559d2f3f83bc3159826cc2e6de30e0258343a2e9125caaa6e6"} Dec 01 10:47:15 crc kubenswrapper[4761]: I1201 10:47:15.121765 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f70547fdbf7f57559d2f3f83bc3159826cc2e6de30e0258343a2e9125caaa6e6" Dec 01 10:47:15 crc kubenswrapper[4761]: I1201 10:47:15.121777 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.062330 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5"] Dec 01 10:47:20 crc kubenswrapper[4761]: E1201 10:47:20.063145 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerName="extract" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.063172 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerName="extract" Dec 01 10:47:20 crc kubenswrapper[4761]: E1201 10:47:20.063186 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerName="pull" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.063195 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerName="pull" Dec 01 10:47:20 crc kubenswrapper[4761]: E1201 10:47:20.063214 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerName="util" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.063221 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerName="util" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.063347 4761 
memory_manager.go:354] "RemoveStaleState removing state" podUID="33f85f97-cf13-45ad-9b74-50272f00a8be" containerName="extract" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.063866 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: W1201 10:47:20.065583 4761 reflector.go:561] object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hg2mt": failed to list *v1.Secret: secrets "keystone-operator-controller-manager-dockercfg-hg2mt" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Dec 01 10:47:20 crc kubenswrapper[4761]: E1201 10:47:20.065625 4761 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"keystone-operator-controller-manager-dockercfg-hg2mt\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-operator-controller-manager-dockercfg-hg2mt\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.069469 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.117369 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5"] Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.190808 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-apiservice-cert\") 
pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.190882 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz44k\" (UniqueName: \"kubernetes.io/projected/28ed4847-2217-4e5d-8d1b-7006e6098116-kube-api-access-qz44k\") pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.190987 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-webhook-cert\") pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.292087 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz44k\" (UniqueName: \"kubernetes.io/projected/28ed4847-2217-4e5d-8d1b-7006e6098116-kube-api-access-qz44k\") pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.292175 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-webhook-cert\") pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " 
pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.292216 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-apiservice-cert\") pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.298182 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-apiservice-cert\") pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.299186 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-webhook-cert\") pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:20 crc kubenswrapper[4761]: I1201 10:47:20.308006 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz44k\" (UniqueName: \"kubernetes.io/projected/28ed4847-2217-4e5d-8d1b-7006e6098116-kube-api-access-qz44k\") pod \"keystone-operator-controller-manager-86757b45cc-8hmf5\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:21 crc kubenswrapper[4761]: I1201 10:47:21.210278 4761 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hg2mt" Dec 01 10:47:21 crc kubenswrapper[4761]: I1201 10:47:21.219621 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:21 crc kubenswrapper[4761]: I1201 10:47:21.477190 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5"] Dec 01 10:47:21 crc kubenswrapper[4761]: W1201 10:47:21.491748 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ed4847_2217_4e5d_8d1b_7006e6098116.slice/crio-38ce51711ed420ea38464fd7d11c752ad63986bbff365c4f09f51405785a859f WatchSource:0}: Error finding container 38ce51711ed420ea38464fd7d11c752ad63986bbff365c4f09f51405785a859f: Status 404 returned error can't find the container with id 38ce51711ed420ea38464fd7d11c752ad63986bbff365c4f09f51405785a859f Dec 01 10:47:22 crc kubenswrapper[4761]: I1201 10:47:22.179869 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" event={"ID":"28ed4847-2217-4e5d-8d1b-7006e6098116","Type":"ContainerStarted","Data":"38ce51711ed420ea38464fd7d11c752ad63986bbff365c4f09f51405785a859f"} Dec 01 10:47:26 crc kubenswrapper[4761]: I1201 10:47:26.208784 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" event={"ID":"28ed4847-2217-4e5d-8d1b-7006e6098116","Type":"ContainerStarted","Data":"ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b"} Dec 01 10:47:26 crc kubenswrapper[4761]: I1201 10:47:26.210369 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:26 crc kubenswrapper[4761]: I1201 
10:47:26.232359 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" podStartSLOduration=1.806893799 podStartE2EDuration="6.232337481s" podCreationTimestamp="2025-12-01 10:47:20 +0000 UTC" firstStartedPulling="2025-12-01 10:47:21.494213263 +0000 UTC m=+980.797971887" lastFinishedPulling="2025-12-01 10:47:25.919656945 +0000 UTC m=+985.223415569" observedRunningTime="2025-12-01 10:47:26.228099127 +0000 UTC m=+985.531857751" watchObservedRunningTime="2025-12-01 10:47:26.232337481 +0000 UTC m=+985.536096095" Dec 01 10:47:31 crc kubenswrapper[4761]: I1201 10:47:31.225760 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:47:33 crc kubenswrapper[4761]: I1201 10:47:33.850618 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:47:33 crc kubenswrapper[4761]: I1201 10:47:33.850916 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.602574 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-rstz4"] Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.603839 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.610717 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-529d-account-create-update-ng6qb"] Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.611855 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.614473 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.621114 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-rstz4"] Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.631151 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-529d-account-create-update-ng6qb"] Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.699514 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ed070d-7777-4cf1-b0c7-007059a78cc3-operator-scripts\") pod \"keystone-529d-account-create-update-ng6qb\" (UID: \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\") " pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.699567 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88r67\" (UniqueName: \"kubernetes.io/projected/e8ed070d-7777-4cf1-b0c7-007059a78cc3-kube-api-access-88r67\") pod \"keystone-529d-account-create-update-ng6qb\" (UID: \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\") " pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.699725 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7cb9538-4868-41e0-b3b8-67c31f777482-operator-scripts\") pod \"keystone-db-create-rstz4\" (UID: \"e7cb9538-4868-41e0-b3b8-67c31f777482\") " pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.699798 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgt8w\" (UniqueName: \"kubernetes.io/projected/e7cb9538-4868-41e0-b3b8-67c31f777482-kube-api-access-fgt8w\") pod \"keystone-db-create-rstz4\" (UID: \"e7cb9538-4868-41e0-b3b8-67c31f777482\") " pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.801490 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88r67\" (UniqueName: \"kubernetes.io/projected/e8ed070d-7777-4cf1-b0c7-007059a78cc3-kube-api-access-88r67\") pod \"keystone-529d-account-create-update-ng6qb\" (UID: \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\") " pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.801611 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7cb9538-4868-41e0-b3b8-67c31f777482-operator-scripts\") pod \"keystone-db-create-rstz4\" (UID: \"e7cb9538-4868-41e0-b3b8-67c31f777482\") " pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.801644 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgt8w\" (UniqueName: \"kubernetes.io/projected/e7cb9538-4868-41e0-b3b8-67c31f777482-kube-api-access-fgt8w\") pod \"keystone-db-create-rstz4\" (UID: \"e7cb9538-4868-41e0-b3b8-67c31f777482\") " pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:34 crc 
kubenswrapper[4761]: I1201 10:47:34.801701 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ed070d-7777-4cf1-b0c7-007059a78cc3-operator-scripts\") pod \"keystone-529d-account-create-update-ng6qb\" (UID: \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\") " pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.802431 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ed070d-7777-4cf1-b0c7-007059a78cc3-operator-scripts\") pod \"keystone-529d-account-create-update-ng6qb\" (UID: \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\") " pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.802845 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7cb9538-4868-41e0-b3b8-67c31f777482-operator-scripts\") pod \"keystone-db-create-rstz4\" (UID: \"e7cb9538-4868-41e0-b3b8-67c31f777482\") " pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.822729 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgt8w\" (UniqueName: \"kubernetes.io/projected/e7cb9538-4868-41e0-b3b8-67c31f777482-kube-api-access-fgt8w\") pod \"keystone-db-create-rstz4\" (UID: \"e7cb9538-4868-41e0-b3b8-67c31f777482\") " pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.832231 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88r67\" (UniqueName: \"kubernetes.io/projected/e8ed070d-7777-4cf1-b0c7-007059a78cc3-kube-api-access-88r67\") pod \"keystone-529d-account-create-update-ng6qb\" (UID: \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\") " 
pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.943829 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:34 crc kubenswrapper[4761]: I1201 10:47:34.955359 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:35 crc kubenswrapper[4761]: I1201 10:47:35.965226 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-rstz4"] Dec 01 10:47:36 crc kubenswrapper[4761]: I1201 10:47:36.185063 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-529d-account-create-update-ng6qb"] Dec 01 10:47:36 crc kubenswrapper[4761]: W1201 10:47:36.191342 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8ed070d_7777_4cf1_b0c7_007059a78cc3.slice/crio-459f82e509b59ce6d901368af150bdb026c6cf99b76adc1ef6148db7a9e4dd7a WatchSource:0}: Error finding container 459f82e509b59ce6d901368af150bdb026c6cf99b76adc1ef6148db7a9e4dd7a: Status 404 returned error can't find the container with id 459f82e509b59ce6d901368af150bdb026c6cf99b76adc1ef6148db7a9e4dd7a Dec 01 10:47:36 crc kubenswrapper[4761]: I1201 10:47:36.273561 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-rstz4" event={"ID":"e7cb9538-4868-41e0-b3b8-67c31f777482","Type":"ContainerStarted","Data":"33afbbd18a2056b8f427cb5fbddf15f01c522bce670fd708cca0f8529787a3ae"} Dec 01 10:47:36 crc kubenswrapper[4761]: I1201 10:47:36.273921 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-rstz4" event={"ID":"e7cb9538-4868-41e0-b3b8-67c31f777482","Type":"ContainerStarted","Data":"acabb9d6da20c0c84b5110cc784d40976805721e9f3313694ba8355ab13bcfdf"} 
Dec 01 10:47:36 crc kubenswrapper[4761]: I1201 10:47:36.275224 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" event={"ID":"e8ed070d-7777-4cf1-b0c7-007059a78cc3","Type":"ContainerStarted","Data":"459f82e509b59ce6d901368af150bdb026c6cf99b76adc1ef6148db7a9e4dd7a"} Dec 01 10:47:36 crc kubenswrapper[4761]: I1201 10:47:36.292362 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-create-rstz4" podStartSLOduration=2.292343159 podStartE2EDuration="2.292343159s" podCreationTimestamp="2025-12-01 10:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:47:36.286593994 +0000 UTC m=+995.590352638" watchObservedRunningTime="2025-12-01 10:47:36.292343159 +0000 UTC m=+995.596101783" Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.271034 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-v8vkt"] Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.272376 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.275493 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-4vfvg" Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.285698 4761 generic.go:334] "Generic (PLEG): container finished" podID="e7cb9538-4868-41e0-b3b8-67c31f777482" containerID="33afbbd18a2056b8f427cb5fbddf15f01c522bce670fd708cca0f8529787a3ae" exitCode=0 Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.285802 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-rstz4" event={"ID":"e7cb9538-4868-41e0-b3b8-67c31f777482","Type":"ContainerDied","Data":"33afbbd18a2056b8f427cb5fbddf15f01c522bce670fd708cca0f8529787a3ae"} Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.288043 4761 generic.go:334] "Generic (PLEG): container finished" podID="e8ed070d-7777-4cf1-b0c7-007059a78cc3" containerID="69eb9966894da3611f83c4534d92de43cf90aa0ad307fa6a9cc8088c4ad4e0de" exitCode=0 Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.288238 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" event={"ID":"e8ed070d-7777-4cf1-b0c7-007059a78cc3","Type":"ContainerDied","Data":"69eb9966894da3611f83c4534d92de43cf90aa0ad307fa6a9cc8088c4ad4e0de"} Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.289008 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-v8vkt"] Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.343241 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dmrl\" (UniqueName: \"kubernetes.io/projected/f837fbca-013a-4054-af8d-fcf798f568d0-kube-api-access-2dmrl\") pod \"horizon-operator-index-v8vkt\" (UID: \"f837fbca-013a-4054-af8d-fcf798f568d0\") " 
pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.444231 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dmrl\" (UniqueName: \"kubernetes.io/projected/f837fbca-013a-4054-af8d-fcf798f568d0-kube-api-access-2dmrl\") pod \"horizon-operator-index-v8vkt\" (UID: \"f837fbca-013a-4054-af8d-fcf798f568d0\") " pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.467446 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dmrl\" (UniqueName: \"kubernetes.io/projected/f837fbca-013a-4054-af8d-fcf798f568d0-kube-api-access-2dmrl\") pod \"horizon-operator-index-v8vkt\" (UID: \"f837fbca-013a-4054-af8d-fcf798f568d0\") " pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:37 crc kubenswrapper[4761]: I1201 10:47:37.629273 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.083037 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-v8vkt"] Dec 01 10:47:38 crc kubenswrapper[4761]: W1201 10:47:38.091707 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf837fbca_013a_4054_af8d_fcf798f568d0.slice/crio-b566b4351a34057f717b82f4a71d0c6918a4dfddcd01d947e8b975ef5b33b55e WatchSource:0}: Error finding container b566b4351a34057f717b82f4a71d0c6918a4dfddcd01d947e8b975ef5b33b55e: Status 404 returned error can't find the container with id b566b4351a34057f717b82f4a71d0c6918a4dfddcd01d947e8b975ef5b33b55e Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.298286 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-v8vkt" 
event={"ID":"f837fbca-013a-4054-af8d-fcf798f568d0","Type":"ContainerStarted","Data":"b566b4351a34057f717b82f4a71d0c6918a4dfddcd01d947e8b975ef5b33b55e"} Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.717929 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.731908 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.760452 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88r67\" (UniqueName: \"kubernetes.io/projected/e8ed070d-7777-4cf1-b0c7-007059a78cc3-kube-api-access-88r67\") pod \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\" (UID: \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\") " Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.760529 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgt8w\" (UniqueName: \"kubernetes.io/projected/e7cb9538-4868-41e0-b3b8-67c31f777482-kube-api-access-fgt8w\") pod \"e7cb9538-4868-41e0-b3b8-67c31f777482\" (UID: \"e7cb9538-4868-41e0-b3b8-67c31f777482\") " Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.760661 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ed070d-7777-4cf1-b0c7-007059a78cc3-operator-scripts\") pod \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\" (UID: \"e8ed070d-7777-4cf1-b0c7-007059a78cc3\") " Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.760775 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7cb9538-4868-41e0-b3b8-67c31f777482-operator-scripts\") pod \"e7cb9538-4868-41e0-b3b8-67c31f777482\" (UID: 
\"e7cb9538-4868-41e0-b3b8-67c31f777482\") " Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.761318 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7cb9538-4868-41e0-b3b8-67c31f777482-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7cb9538-4868-41e0-b3b8-67c31f777482" (UID: "e7cb9538-4868-41e0-b3b8-67c31f777482"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.761329 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ed070d-7777-4cf1-b0c7-007059a78cc3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8ed070d-7777-4cf1-b0c7-007059a78cc3" (UID: "e8ed070d-7777-4cf1-b0c7-007059a78cc3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.771288 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7cb9538-4868-41e0-b3b8-67c31f777482-kube-api-access-fgt8w" (OuterVolumeSpecName: "kube-api-access-fgt8w") pod "e7cb9538-4868-41e0-b3b8-67c31f777482" (UID: "e7cb9538-4868-41e0-b3b8-67c31f777482"). InnerVolumeSpecName "kube-api-access-fgt8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.772681 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ed070d-7777-4cf1-b0c7-007059a78cc3-kube-api-access-88r67" (OuterVolumeSpecName: "kube-api-access-88r67") pod "e8ed070d-7777-4cf1-b0c7-007059a78cc3" (UID: "e8ed070d-7777-4cf1-b0c7-007059a78cc3"). InnerVolumeSpecName "kube-api-access-88r67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.862352 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88r67\" (UniqueName: \"kubernetes.io/projected/e8ed070d-7777-4cf1-b0c7-007059a78cc3-kube-api-access-88r67\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.862398 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgt8w\" (UniqueName: \"kubernetes.io/projected/e7cb9538-4868-41e0-b3b8-67c31f777482-kube-api-access-fgt8w\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.862413 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ed070d-7777-4cf1-b0c7-007059a78cc3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:38 crc kubenswrapper[4761]: I1201 10:47:38.862425 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7cb9538-4868-41e0-b3b8-67c31f777482-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:39 crc kubenswrapper[4761]: I1201 10:47:39.314132 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-rstz4" event={"ID":"e7cb9538-4868-41e0-b3b8-67c31f777482","Type":"ContainerDied","Data":"acabb9d6da20c0c84b5110cc784d40976805721e9f3313694ba8355ab13bcfdf"} Dec 01 10:47:39 crc kubenswrapper[4761]: I1201 10:47:39.314229 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acabb9d6da20c0c84b5110cc784d40976805721e9f3313694ba8355ab13bcfdf" Dec 01 10:47:39 crc kubenswrapper[4761]: I1201 10:47:39.314280 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-rstz4" Dec 01 10:47:39 crc kubenswrapper[4761]: I1201 10:47:39.316978 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" event={"ID":"e8ed070d-7777-4cf1-b0c7-007059a78cc3","Type":"ContainerDied","Data":"459f82e509b59ce6d901368af150bdb026c6cf99b76adc1ef6148db7a9e4dd7a"} Dec 01 10:47:39 crc kubenswrapper[4761]: I1201 10:47:39.317060 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="459f82e509b59ce6d901368af150bdb026c6cf99b76adc1ef6148db7a9e4dd7a" Dec 01 10:47:39 crc kubenswrapper[4761]: I1201 10:47:39.317139 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-529d-account-create-update-ng6qb" Dec 01 10:47:39 crc kubenswrapper[4761]: I1201 10:47:39.321442 4761 generic.go:334] "Generic (PLEG): container finished" podID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" containerID="922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6" exitCode=0 Dec 01 10:47:39 crc kubenswrapper[4761]: I1201 10:47:39.321504 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e07e5919-c158-40b5-a20d-6c07c7f98ecd","Type":"ContainerDied","Data":"922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6"} Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.330182 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e07e5919-c158-40b5-a20d-6c07c7f98ecd","Type":"ContainerStarted","Data":"2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8"} Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.331076 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.332118 4761 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/horizon-operator-index-v8vkt" event={"ID":"f837fbca-013a-4054-af8d-fcf798f568d0","Type":"ContainerStarted","Data":"3443cc9c7ca54e7716255c55fc31333a1bb974ec2fc5aa0b321555bba70d6a01"} Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.350441 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=38.532160483 podStartE2EDuration="46.350420014s" podCreationTimestamp="2025-12-01 10:46:54 +0000 UTC" firstStartedPulling="2025-12-01 10:46:57.731308912 +0000 UTC m=+957.035067526" lastFinishedPulling="2025-12-01 10:47:05.549568433 +0000 UTC m=+964.853327057" observedRunningTime="2025-12-01 10:47:40.347895106 +0000 UTC m=+999.651653730" watchObservedRunningTime="2025-12-01 10:47:40.350420014 +0000 UTC m=+999.654178638" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.376284 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-v8vkt" podStartSLOduration=1.965204121 podStartE2EDuration="3.376262239s" podCreationTimestamp="2025-12-01 10:47:37 +0000 UTC" firstStartedPulling="2025-12-01 10:47:38.092999755 +0000 UTC m=+997.396758379" lastFinishedPulling="2025-12-01 10:47:39.504057873 +0000 UTC m=+998.807816497" observedRunningTime="2025-12-01 10:47:40.368231793 +0000 UTC m=+999.671990427" watchObservedRunningTime="2025-12-01 10:47:40.376262239 +0000 UTC m=+999.680020883" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.865205 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-879zt"] Dec 01 10:47:40 crc kubenswrapper[4761]: E1201 10:47:40.865512 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cb9538-4868-41e0-b3b8-67c31f777482" containerName="mariadb-database-create" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.865527 4761 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e7cb9538-4868-41e0-b3b8-67c31f777482" containerName="mariadb-database-create" Dec 01 10:47:40 crc kubenswrapper[4761]: E1201 10:47:40.865539 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ed070d-7777-4cf1-b0c7-007059a78cc3" containerName="mariadb-account-create-update" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.865552 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ed070d-7777-4cf1-b0c7-007059a78cc3" containerName="mariadb-account-create-update" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.865710 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7cb9538-4868-41e0-b3b8-67c31f777482" containerName="mariadb-database-create" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.865737 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ed070d-7777-4cf1-b0c7-007059a78cc3" containerName="mariadb-account-create-update" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.866389 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-879zt" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.868438 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-2bv6f" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.878661 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-879zt"] Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.892098 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp85x\" (UniqueName: \"kubernetes.io/projected/a19b5c4f-08b9-42e4-82e8-e965d992e10e-kube-api-access-sp85x\") pod \"swift-operator-index-879zt\" (UID: \"a19b5c4f-08b9-42e4-82e8-e965d992e10e\") " pod="openstack-operators/swift-operator-index-879zt" Dec 01 10:47:40 crc kubenswrapper[4761]: I1201 10:47:40.994661 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp85x\" (UniqueName: \"kubernetes.io/projected/a19b5c4f-08b9-42e4-82e8-e965d992e10e-kube-api-access-sp85x\") pod \"swift-operator-index-879zt\" (UID: \"a19b5c4f-08b9-42e4-82e8-e965d992e10e\") " pod="openstack-operators/swift-operator-index-879zt" Dec 01 10:47:41 crc kubenswrapper[4761]: I1201 10:47:41.018652 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp85x\" (UniqueName: \"kubernetes.io/projected/a19b5c4f-08b9-42e4-82e8-e965d992e10e-kube-api-access-sp85x\") pod \"swift-operator-index-879zt\" (UID: \"a19b5c4f-08b9-42e4-82e8-e965d992e10e\") " pod="openstack-operators/swift-operator-index-879zt" Dec 01 10:47:41 crc kubenswrapper[4761]: I1201 10:47:41.189729 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-879zt" Dec 01 10:47:41 crc kubenswrapper[4761]: I1201 10:47:41.661524 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-879zt"] Dec 01 10:47:41 crc kubenswrapper[4761]: W1201 10:47:41.664764 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda19b5c4f_08b9_42e4_82e8_e965d992e10e.slice/crio-c57b98161b598cf7013cb4c6832cdab35ef953bf9d85a00d32c7a3183b765b19 WatchSource:0}: Error finding container c57b98161b598cf7013cb4c6832cdab35ef953bf9d85a00d32c7a3183b765b19: Status 404 returned error can't find the container with id c57b98161b598cf7013cb4c6832cdab35ef953bf9d85a00d32c7a3183b765b19 Dec 01 10:47:42 crc kubenswrapper[4761]: I1201 10:47:42.348400 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-879zt" event={"ID":"a19b5c4f-08b9-42e4-82e8-e965d992e10e","Type":"ContainerStarted","Data":"c57b98161b598cf7013cb4c6832cdab35ef953bf9d85a00d32c7a3183b765b19"} Dec 01 10:47:43 crc kubenswrapper[4761]: I1201 10:47:43.358107 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-879zt" event={"ID":"a19b5c4f-08b9-42e4-82e8-e965d992e10e","Type":"ContainerStarted","Data":"f5427f44b0705eb16c1bf87a3da1c7f284481fdce293926378effaf68d029331"} Dec 01 10:47:43 crc kubenswrapper[4761]: I1201 10:47:43.381436 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-879zt" podStartSLOduration=2.331167857 podStartE2EDuration="3.381420334s" podCreationTimestamp="2025-12-01 10:47:40 +0000 UTC" firstStartedPulling="2025-12-01 10:47:41.666442775 +0000 UTC m=+1000.970201399" lastFinishedPulling="2025-12-01 10:47:42.716695242 +0000 UTC m=+1002.020453876" observedRunningTime="2025-12-01 10:47:43.375128024 +0000 UTC m=+1002.678886688" 
watchObservedRunningTime="2025-12-01 10:47:43.381420334 +0000 UTC m=+1002.685178958" Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.055633 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-879zt"] Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.056852 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-879zt" podUID="a19b5c4f-08b9-42e4-82e8-e965d992e10e" containerName="registry-server" containerID="cri-o://f5427f44b0705eb16c1bf87a3da1c7f284481fdce293926378effaf68d029331" gracePeriod=2 Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.378212 4761 generic.go:334] "Generic (PLEG): container finished" podID="a19b5c4f-08b9-42e4-82e8-e965d992e10e" containerID="f5427f44b0705eb16c1bf87a3da1c7f284481fdce293926378effaf68d029331" exitCode=0 Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.378305 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-879zt" event={"ID":"a19b5c4f-08b9-42e4-82e8-e965d992e10e","Type":"ContainerDied","Data":"f5427f44b0705eb16c1bf87a3da1c7f284481fdce293926378effaf68d029331"} Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.441178 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-879zt" Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.569829 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp85x\" (UniqueName: \"kubernetes.io/projected/a19b5c4f-08b9-42e4-82e8-e965d992e10e-kube-api-access-sp85x\") pod \"a19b5c4f-08b9-42e4-82e8-e965d992e10e\" (UID: \"a19b5c4f-08b9-42e4-82e8-e965d992e10e\") " Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.576064 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19b5c4f-08b9-42e4-82e8-e965d992e10e-kube-api-access-sp85x" (OuterVolumeSpecName: "kube-api-access-sp85x") pod "a19b5c4f-08b9-42e4-82e8-e965d992e10e" (UID: "a19b5c4f-08b9-42e4-82e8-e965d992e10e"). InnerVolumeSpecName "kube-api-access-sp85x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.671838 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp85x\" (UniqueName: \"kubernetes.io/projected/a19b5c4f-08b9-42e4-82e8-e965d992e10e-kube-api-access-sp85x\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.885897 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-stmvr"] Dec 01 10:47:46 crc kubenswrapper[4761]: E1201 10:47:46.886291 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19b5c4f-08b9-42e4-82e8-e965d992e10e" containerName="registry-server" Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.886313 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19b5c4f-08b9-42e4-82e8-e965d992e10e" containerName="registry-server" Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.886491 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19b5c4f-08b9-42e4-82e8-e965d992e10e" containerName="registry-server" Dec 01 10:47:46 crc 
kubenswrapper[4761]: I1201 10:47:46.887107 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.893756 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-stmvr"] Dec 01 10:47:46 crc kubenswrapper[4761]: I1201 10:47:46.980594 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvm78\" (UniqueName: \"kubernetes.io/projected/c3153419-6c29-4301-a072-acfcee97b630-kube-api-access-lvm78\") pod \"swift-operator-index-stmvr\" (UID: \"c3153419-6c29-4301-a072-acfcee97b630\") " pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.082282 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvm78\" (UniqueName: \"kubernetes.io/projected/c3153419-6c29-4301-a072-acfcee97b630-kube-api-access-lvm78\") pod \"swift-operator-index-stmvr\" (UID: \"c3153419-6c29-4301-a072-acfcee97b630\") " pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.099553 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvm78\" (UniqueName: \"kubernetes.io/projected/c3153419-6c29-4301-a072-acfcee97b630-kube-api-access-lvm78\") pod \"swift-operator-index-stmvr\" (UID: \"c3153419-6c29-4301-a072-acfcee97b630\") " pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.210161 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.389050 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-879zt" event={"ID":"a19b5c4f-08b9-42e4-82e8-e965d992e10e","Type":"ContainerDied","Data":"c57b98161b598cf7013cb4c6832cdab35ef953bf9d85a00d32c7a3183b765b19"} Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.389348 4761 scope.go:117] "RemoveContainer" containerID="f5427f44b0705eb16c1bf87a3da1c7f284481fdce293926378effaf68d029331" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.389162 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-879zt" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.427582 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-879zt"] Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.438085 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-879zt"] Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.629394 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.629436 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.695963 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:47 crc kubenswrapper[4761]: I1201 10:47:47.728071 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-stmvr"] Dec 01 10:47:47 crc kubenswrapper[4761]: W1201 10:47:47.735270 4761 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3153419_6c29_4301_a072_acfcee97b630.slice/crio-e8c00ab72024931b0fc319fe2e709529335cd613fece2c96474e37b850d1a02e WatchSource:0}: Error finding container e8c00ab72024931b0fc319fe2e709529335cd613fece2c96474e37b850d1a02e: Status 404 returned error can't find the container with id e8c00ab72024931b0fc319fe2e709529335cd613fece2c96474e37b850d1a02e Dec 01 10:47:48 crc kubenswrapper[4761]: I1201 10:47:48.396712 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-stmvr" event={"ID":"c3153419-6c29-4301-a072-acfcee97b630","Type":"ContainerStarted","Data":"d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673"} Dec 01 10:47:48 crc kubenswrapper[4761]: I1201 10:47:48.397046 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-stmvr" event={"ID":"c3153419-6c29-4301-a072-acfcee97b630","Type":"ContainerStarted","Data":"e8c00ab72024931b0fc319fe2e709529335cd613fece2c96474e37b850d1a02e"} Dec 01 10:47:48 crc kubenswrapper[4761]: I1201 10:47:48.411766 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-stmvr" podStartSLOduration=1.89318959 podStartE2EDuration="2.411746677s" podCreationTimestamp="2025-12-01 10:47:46 +0000 UTC" firstStartedPulling="2025-12-01 10:47:47.740347746 +0000 UTC m=+1007.044106370" lastFinishedPulling="2025-12-01 10:47:48.258904833 +0000 UTC m=+1007.562663457" observedRunningTime="2025-12-01 10:47:48.41000003 +0000 UTC m=+1007.713758644" watchObservedRunningTime="2025-12-01 10:47:48.411746677 +0000 UTC m=+1007.715505291" Dec 01 10:47:48 crc kubenswrapper[4761]: I1201 10:47:48.428234 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-v8vkt" Dec 01 10:47:49 crc kubenswrapper[4761]: I1201 10:47:49.136175 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="a19b5c4f-08b9-42e4-82e8-e965d992e10e" path="/var/lib/kubelet/pods/a19b5c4f-08b9-42e4-82e8-e965d992e10e/volumes" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.211455 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.212437 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.229908 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.261069 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.526903 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.698456 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-j9f8v"] Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.699221 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.701116 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.701332 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-x8f27" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.701459 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.701591 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.715537 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-j9f8v"] Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.843744 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmk8q\" (UniqueName: \"kubernetes.io/projected/92aa0c50-5322-495e-b8cf-6b6fc22813f8-kube-api-access-vmk8q\") pod \"keystone-db-sync-j9f8v\" (UID: \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\") " pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.843808 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa0c50-5322-495e-b8cf-6b6fc22813f8-config-data\") pod \"keystone-db-sync-j9f8v\" (UID: \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\") " pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.945259 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa0c50-5322-495e-b8cf-6b6fc22813f8-config-data\") pod 
\"keystone-db-sync-j9f8v\" (UID: \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\") " pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.945364 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmk8q\" (UniqueName: \"kubernetes.io/projected/92aa0c50-5322-495e-b8cf-6b6fc22813f8-kube-api-access-vmk8q\") pod \"keystone-db-sync-j9f8v\" (UID: \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\") " pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.963336 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmk8q\" (UniqueName: \"kubernetes.io/projected/92aa0c50-5322-495e-b8cf-6b6fc22813f8-kube-api-access-vmk8q\") pod \"keystone-db-sync-j9f8v\" (UID: \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\") " pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:47:57 crc kubenswrapper[4761]: I1201 10:47:57.963354 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa0c50-5322-495e-b8cf-6b6fc22813f8-config-data\") pod \"keystone-db-sync-j9f8v\" (UID: \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\") " pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:47:58 crc kubenswrapper[4761]: I1201 10:47:58.058744 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:47:58 crc kubenswrapper[4761]: I1201 10:47:58.550034 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-j9f8v"] Dec 01 10:47:58 crc kubenswrapper[4761]: W1201 10:47:58.552285 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92aa0c50_5322_495e_b8cf_6b6fc22813f8.slice/crio-30be7a982cba5295a66660031716fb20121cf53047e89863b47ec92b29ab9b85 WatchSource:0}: Error finding container 30be7a982cba5295a66660031716fb20121cf53047e89863b47ec92b29ab9b85: Status 404 returned error can't find the container with id 30be7a982cba5295a66660031716fb20121cf53047e89863b47ec92b29ab9b85 Dec 01 10:47:58 crc kubenswrapper[4761]: I1201 10:47:58.554393 4761 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:47:59 crc kubenswrapper[4761]: I1201 10:47:59.499982 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-j9f8v" event={"ID":"92aa0c50-5322-495e-b8cf-6b6fc22813f8","Type":"ContainerStarted","Data":"30be7a982cba5295a66660031716fb20121cf53047e89863b47ec92b29ab9b85"} Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.297436 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x"] Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.299345 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.309852 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8w9gk" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.311981 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x"] Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.383479 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.383539 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.383648 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhg4\" (UniqueName: \"kubernetes.io/projected/b97b53a1-4f2c-457c-8d54-af349c67f688-kube-api-access-fjhg4\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 
10:48:00.485436 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.485531 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjhg4\" (UniqueName: \"kubernetes.io/projected/b97b53a1-4f2c-457c-8d54-af349c67f688-kube-api-access-fjhg4\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.485682 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.486302 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.486386 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.522363 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhg4\" (UniqueName: \"kubernetes.io/projected/b97b53a1-4f2c-457c-8d54-af349c67f688-kube-api-access-fjhg4\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:00 crc kubenswrapper[4761]: I1201 10:48:00.652442 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.143043 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x"] Dec 01 10:48:01 crc kubenswrapper[4761]: W1201 10:48:01.158160 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb97b53a1_4f2c_457c_8d54_af349c67f688.slice/crio-83300a7068ff5e60742f1c9168a7b7e4a5d3af91401a6ccb611cab6a5852f9cc WatchSource:0}: Error finding container 83300a7068ff5e60742f1c9168a7b7e4a5d3af91401a6ccb611cab6a5852f9cc: Status 404 returned error can't find the container with id 83300a7068ff5e60742f1c9168a7b7e4a5d3af91401a6ccb611cab6a5852f9cc Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.298505 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb"] Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 
10:48:01.300075 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.310523 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb"] Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.359953 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxw2m\" (UniqueName: \"kubernetes.io/projected/495e601e-4706-49fe-b2c9-5d7eb14bf566-kube-api-access-xxw2m\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.360331 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.360376 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.461860 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.462005 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxw2m\" (UniqueName: \"kubernetes.io/projected/495e601e-4706-49fe-b2c9-5d7eb14bf566-kube-api-access-xxw2m\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.462046 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.462807 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.462843 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: 
\"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.501467 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxw2m\" (UniqueName: \"kubernetes.io/projected/495e601e-4706-49fe-b2c9-5d7eb14bf566-kube-api-access-xxw2m\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.516228 4761 generic.go:334] "Generic (PLEG): container finished" podID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerID="29f4224b52e7c88ee9c0d605fe7996142ea3430cef35a987e9c44709004ad1a7" exitCode=0 Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.516291 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" event={"ID":"b97b53a1-4f2c-457c-8d54-af349c67f688","Type":"ContainerDied","Data":"29f4224b52e7c88ee9c0d605fe7996142ea3430cef35a987e9c44709004ad1a7"} Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.516321 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" event={"ID":"b97b53a1-4f2c-457c-8d54-af349c67f688","Type":"ContainerStarted","Data":"83300a7068ff5e60742f1c9168a7b7e4a5d3af91401a6ccb611cab6a5852f9cc"} Dec 01 10:48:01 crc kubenswrapper[4761]: I1201 10:48:01.628901 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:02 crc kubenswrapper[4761]: I1201 10:48:02.117520 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb"] Dec 01 10:48:02 crc kubenswrapper[4761]: W1201 10:48:02.128696 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod495e601e_4706_49fe_b2c9_5d7eb14bf566.slice/crio-a67df0fe8b8039fcf6551bd3541f3831053682f16c4130771368d5112ea532f4 WatchSource:0}: Error finding container a67df0fe8b8039fcf6551bd3541f3831053682f16c4130771368d5112ea532f4: Status 404 returned error can't find the container with id a67df0fe8b8039fcf6551bd3541f3831053682f16c4130771368d5112ea532f4 Dec 01 10:48:02 crc kubenswrapper[4761]: I1201 10:48:02.525822 4761 generic.go:334] "Generic (PLEG): container finished" podID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerID="3087bb4c94619cfbd60bbfe760dfc2942ab2b1baaf310a4da24660c880808b2f" exitCode=0 Dec 01 10:48:02 crc kubenswrapper[4761]: I1201 10:48:02.525918 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" event={"ID":"495e601e-4706-49fe-b2c9-5d7eb14bf566","Type":"ContainerDied","Data":"3087bb4c94619cfbd60bbfe760dfc2942ab2b1baaf310a4da24660c880808b2f"} Dec 01 10:48:02 crc kubenswrapper[4761]: I1201 10:48:02.526218 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" event={"ID":"495e601e-4706-49fe-b2c9-5d7eb14bf566","Type":"ContainerStarted","Data":"a67df0fe8b8039fcf6551bd3541f3831053682f16c4130771368d5112ea532f4"} Dec 01 10:48:03 crc kubenswrapper[4761]: I1201 10:48:03.850240 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:48:03 crc kubenswrapper[4761]: I1201 10:48:03.850313 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:48:11 crc kubenswrapper[4761]: I1201 10:48:11.596033 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-j9f8v" event={"ID":"92aa0c50-5322-495e-b8cf-6b6fc22813f8","Type":"ContainerStarted","Data":"d76403c1539eb5763266fa719c150d040730bba08c60e3759b09e5da1b68c987"} Dec 01 10:48:11 crc kubenswrapper[4761]: I1201 10:48:11.601621 4761 generic.go:334] "Generic (PLEG): container finished" podID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerID="67559ef6e20c7e3f9fa5a9330e61139a7e287718bd4441ffd67765a20aa97053" exitCode=0 Dec 01 10:48:11 crc kubenswrapper[4761]: I1201 10:48:11.601684 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" event={"ID":"b97b53a1-4f2c-457c-8d54-af349c67f688","Type":"ContainerDied","Data":"67559ef6e20c7e3f9fa5a9330e61139a7e287718bd4441ffd67765a20aa97053"} Dec 01 10:48:11 crc kubenswrapper[4761]: I1201 10:48:11.606034 4761 generic.go:334] "Generic (PLEG): container finished" podID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerID="a2a9b29e337986a18d243476a36bbcd121a99b7291bcdb0a7aaef90de906d052" exitCode=0 Dec 01 10:48:11 crc kubenswrapper[4761]: I1201 10:48:11.606080 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" 
event={"ID":"495e601e-4706-49fe-b2c9-5d7eb14bf566","Type":"ContainerDied","Data":"a2a9b29e337986a18d243476a36bbcd121a99b7291bcdb0a7aaef90de906d052"} Dec 01 10:48:11 crc kubenswrapper[4761]: I1201 10:48:11.618928 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-j9f8v" podStartSLOduration=2.086110071 podStartE2EDuration="14.618901162s" podCreationTimestamp="2025-12-01 10:47:57 +0000 UTC" firstStartedPulling="2025-12-01 10:47:58.554214564 +0000 UTC m=+1017.857973188" lastFinishedPulling="2025-12-01 10:48:11.087005625 +0000 UTC m=+1030.390764279" observedRunningTime="2025-12-01 10:48:11.61511056 +0000 UTC m=+1030.918869194" watchObservedRunningTime="2025-12-01 10:48:11.618901162 +0000 UTC m=+1030.922659836" Dec 01 10:48:12 crc kubenswrapper[4761]: I1201 10:48:12.614916 4761 generic.go:334] "Generic (PLEG): container finished" podID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerID="5f2e6bd6e8d833a4caab02c93a0c3eeea267d485d18916a177ca22f00bbed88d" exitCode=0 Dec 01 10:48:12 crc kubenswrapper[4761]: I1201 10:48:12.615044 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" event={"ID":"b97b53a1-4f2c-457c-8d54-af349c67f688","Type":"ContainerDied","Data":"5f2e6bd6e8d833a4caab02c93a0c3eeea267d485d18916a177ca22f00bbed88d"} Dec 01 10:48:12 crc kubenswrapper[4761]: I1201 10:48:12.618444 4761 generic.go:334] "Generic (PLEG): container finished" podID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerID="4117e8a839798d1372a47df36bd00d5bf0ba6c25e23380deda04eae81864d79b" exitCode=0 Dec 01 10:48:12 crc kubenswrapper[4761]: I1201 10:48:12.619314 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" 
event={"ID":"495e601e-4706-49fe-b2c9-5d7eb14bf566","Type":"ContainerDied","Data":"4117e8a839798d1372a47df36bd00d5bf0ba6c25e23380deda04eae81864d79b"} Dec 01 10:48:13 crc kubenswrapper[4761]: I1201 10:48:13.938425 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:13 crc kubenswrapper[4761]: I1201 10:48:13.942614 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.044923 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-bundle\") pod \"b97b53a1-4f2c-457c-8d54-af349c67f688\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.044980 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-bundle\") pod \"495e601e-4706-49fe-b2c9-5d7eb14bf566\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.045042 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-util\") pod \"495e601e-4706-49fe-b2c9-5d7eb14bf566\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.045086 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-util\") pod \"b97b53a1-4f2c-457c-8d54-af349c67f688\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " Dec 01 10:48:14 crc 
kubenswrapper[4761]: I1201 10:48:14.045112 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxw2m\" (UniqueName: \"kubernetes.io/projected/495e601e-4706-49fe-b2c9-5d7eb14bf566-kube-api-access-xxw2m\") pod \"495e601e-4706-49fe-b2c9-5d7eb14bf566\" (UID: \"495e601e-4706-49fe-b2c9-5d7eb14bf566\") " Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.045158 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjhg4\" (UniqueName: \"kubernetes.io/projected/b97b53a1-4f2c-457c-8d54-af349c67f688-kube-api-access-fjhg4\") pod \"b97b53a1-4f2c-457c-8d54-af349c67f688\" (UID: \"b97b53a1-4f2c-457c-8d54-af349c67f688\") " Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.046219 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-bundle" (OuterVolumeSpecName: "bundle") pod "b97b53a1-4f2c-457c-8d54-af349c67f688" (UID: "b97b53a1-4f2c-457c-8d54-af349c67f688"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.046411 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-bundle" (OuterVolumeSpecName: "bundle") pod "495e601e-4706-49fe-b2c9-5d7eb14bf566" (UID: "495e601e-4706-49fe-b2c9-5d7eb14bf566"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.050630 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495e601e-4706-49fe-b2c9-5d7eb14bf566-kube-api-access-xxw2m" (OuterVolumeSpecName: "kube-api-access-xxw2m") pod "495e601e-4706-49fe-b2c9-5d7eb14bf566" (UID: "495e601e-4706-49fe-b2c9-5d7eb14bf566"). InnerVolumeSpecName "kube-api-access-xxw2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.051929 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97b53a1-4f2c-457c-8d54-af349c67f688-kube-api-access-fjhg4" (OuterVolumeSpecName: "kube-api-access-fjhg4") pod "b97b53a1-4f2c-457c-8d54-af349c67f688" (UID: "b97b53a1-4f2c-457c-8d54-af349c67f688"). InnerVolumeSpecName "kube-api-access-fjhg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.055014 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-util" (OuterVolumeSpecName: "util") pod "495e601e-4706-49fe-b2c9-5d7eb14bf566" (UID: "495e601e-4706-49fe-b2c9-5d7eb14bf566"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.057793 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-util" (OuterVolumeSpecName: "util") pod "b97b53a1-4f2c-457c-8d54-af349c67f688" (UID: "b97b53a1-4f2c-457c-8d54-af349c67f688"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.146058 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxw2m\" (UniqueName: \"kubernetes.io/projected/495e601e-4706-49fe-b2c9-5d7eb14bf566-kube-api-access-xxw2m\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.146249 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjhg4\" (UniqueName: \"kubernetes.io/projected/b97b53a1-4f2c-457c-8d54-af349c67f688-kube-api-access-fjhg4\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.146259 4761 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.146267 4761 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.146276 4761 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/495e601e-4706-49fe-b2c9-5d7eb14bf566-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.146284 4761 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b97b53a1-4f2c-457c-8d54-af349c67f688-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.635878 4761 generic.go:334] "Generic (PLEG): container finished" podID="92aa0c50-5322-495e-b8cf-6b6fc22813f8" containerID="d76403c1539eb5763266fa719c150d040730bba08c60e3759b09e5da1b68c987" exitCode=0 Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.635965 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/keystone-db-sync-j9f8v" event={"ID":"92aa0c50-5322-495e-b8cf-6b6fc22813f8","Type":"ContainerDied","Data":"d76403c1539eb5763266fa719c150d040730bba08c60e3759b09e5da1b68c987"} Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.641762 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" event={"ID":"b97b53a1-4f2c-457c-8d54-af349c67f688","Type":"ContainerDied","Data":"83300a7068ff5e60742f1c9168a7b7e4a5d3af91401a6ccb611cab6a5852f9cc"} Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.641800 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83300a7068ff5e60742f1c9168a7b7e4a5d3af91401a6ccb611cab6a5852f9cc" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.641862 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.644637 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" event={"ID":"495e601e-4706-49fe-b2c9-5d7eb14bf566","Type":"ContainerDied","Data":"a67df0fe8b8039fcf6551bd3541f3831053682f16c4130771368d5112ea532f4"} Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.644680 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a67df0fe8b8039fcf6551bd3541f3831053682f16c4130771368d5112ea532f4" Dec 01 10:48:14 crc kubenswrapper[4761]: I1201 10:48:14.644788 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb" Dec 01 10:48:14 crc kubenswrapper[4761]: E1201 10:48:14.699302 4761 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb97b53a1_4f2c_457c_8d54_af349c67f688.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:48:15 crc kubenswrapper[4761]: I1201 10:48:15.913411 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.079170 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa0c50-5322-495e-b8cf-6b6fc22813f8-config-data\") pod \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\" (UID: \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\") " Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.079235 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmk8q\" (UniqueName: \"kubernetes.io/projected/92aa0c50-5322-495e-b8cf-6b6fc22813f8-kube-api-access-vmk8q\") pod \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\" (UID: \"92aa0c50-5322-495e-b8cf-6b6fc22813f8\") " Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.084813 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92aa0c50-5322-495e-b8cf-6b6fc22813f8-kube-api-access-vmk8q" (OuterVolumeSpecName: "kube-api-access-vmk8q") pod "92aa0c50-5322-495e-b8cf-6b6fc22813f8" (UID: "92aa0c50-5322-495e-b8cf-6b6fc22813f8"). InnerVolumeSpecName "kube-api-access-vmk8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.126923 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92aa0c50-5322-495e-b8cf-6b6fc22813f8-config-data" (OuterVolumeSpecName: "config-data") pod "92aa0c50-5322-495e-b8cf-6b6fc22813f8" (UID: "92aa0c50-5322-495e-b8cf-6b6fc22813f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.181152 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmk8q\" (UniqueName: \"kubernetes.io/projected/92aa0c50-5322-495e-b8cf-6b6fc22813f8-kube-api-access-vmk8q\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.181220 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92aa0c50-5322-495e-b8cf-6b6fc22813f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.659720 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-j9f8v" event={"ID":"92aa0c50-5322-495e-b8cf-6b6fc22813f8","Type":"ContainerDied","Data":"30be7a982cba5295a66660031716fb20121cf53047e89863b47ec92b29ab9b85"} Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.659762 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30be7a982cba5295a66660031716fb20121cf53047e89863b47ec92b29ab9b85" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.659826 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-j9f8v" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.857854 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-75g7j"] Dec 01 10:48:16 crc kubenswrapper[4761]: E1201 10:48:16.858152 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerName="extract" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858173 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerName="extract" Dec 01 10:48:16 crc kubenswrapper[4761]: E1201 10:48:16.858191 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerName="util" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858219 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerName="util" Dec 01 10:48:16 crc kubenswrapper[4761]: E1201 10:48:16.858232 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerName="util" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858240 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerName="util" Dec 01 10:48:16 crc kubenswrapper[4761]: E1201 10:48:16.858254 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerName="pull" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858262 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerName="pull" Dec 01 10:48:16 crc kubenswrapper[4761]: E1201 10:48:16.858278 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerName="pull" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 
10:48:16.858286 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerName="pull" Dec 01 10:48:16 crc kubenswrapper[4761]: E1201 10:48:16.858303 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerName="extract" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858311 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerName="extract" Dec 01 10:48:16 crc kubenswrapper[4761]: E1201 10:48:16.858337 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92aa0c50-5322-495e-b8cf-6b6fc22813f8" containerName="keystone-db-sync" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858345 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="92aa0c50-5322-495e-b8cf-6b6fc22813f8" containerName="keystone-db-sync" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858482 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97b53a1-4f2c-457c-8d54-af349c67f688" containerName="extract" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858499 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="495e601e-4706-49fe-b2c9-5d7eb14bf566" containerName="extract" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.858511 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="92aa0c50-5322-495e-b8cf-6b6fc22813f8" containerName="keystone-db-sync" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.859051 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.862052 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.862263 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.862310 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.862414 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.862467 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-x8f27" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.875546 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-75g7j"] Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.992670 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-scripts\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.992794 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-credential-keys\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.992863 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlvkz\" (UniqueName: \"kubernetes.io/projected/1fb8eee3-094e-4a75-b41e-5183c5f09278-kube-api-access-wlvkz\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.993023 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-fernet-keys\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:16 crc kubenswrapper[4761]: I1201 10:48:16.993061 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-config-data\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.094230 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-credential-keys\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.094301 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlvkz\" (UniqueName: \"kubernetes.io/projected/1fb8eee3-094e-4a75-b41e-5183c5f09278-kube-api-access-wlvkz\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.094366 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-fernet-keys\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.094389 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-config-data\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.094424 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-scripts\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.099366 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-scripts\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.099966 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-fernet-keys\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.100760 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-credential-keys\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.105154 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-config-data\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.108834 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlvkz\" (UniqueName: \"kubernetes.io/projected/1fb8eee3-094e-4a75-b41e-5183c5f09278-kube-api-access-wlvkz\") pod \"keystone-bootstrap-75g7j\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.177037 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.590408 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-75g7j"] Dec 01 10:48:17 crc kubenswrapper[4761]: W1201 10:48:17.595482 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fb8eee3_094e_4a75_b41e_5183c5f09278.slice/crio-2e654ddad9b29f6db1734620b219ad5933a61201ec3207508b34c67fc0a2a4f8 WatchSource:0}: Error finding container 2e654ddad9b29f6db1734620b219ad5933a61201ec3207508b34c67fc0a2a4f8: Status 404 returned error can't find the container with id 2e654ddad9b29f6db1734620b219ad5933a61201ec3207508b34c67fc0a2a4f8 Dec 01 10:48:17 crc kubenswrapper[4761]: I1201 10:48:17.670460 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" event={"ID":"1fb8eee3-094e-4a75-b41e-5183c5f09278","Type":"ContainerStarted","Data":"2e654ddad9b29f6db1734620b219ad5933a61201ec3207508b34c67fc0a2a4f8"} Dec 01 10:48:18 crc kubenswrapper[4761]: I1201 10:48:18.680031 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" event={"ID":"1fb8eee3-094e-4a75-b41e-5183c5f09278","Type":"ContainerStarted","Data":"bc35d40c83fc14655090b1b0ad4e92e1ceca84962cc0e9572df8655009fe3c37"} Dec 01 10:48:18 crc kubenswrapper[4761]: I1201 10:48:18.695411 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" podStartSLOduration=2.695392577 podStartE2EDuration="2.695392577s" podCreationTimestamp="2025-12-01 10:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:18.694587265 +0000 UTC m=+1037.998345889" watchObservedRunningTime="2025-12-01 10:48:18.695392577 +0000 UTC m=+1037.999151201" 
Dec 01 10:48:21 crc kubenswrapper[4761]: I1201 10:48:21.706755 4761 generic.go:334] "Generic (PLEG): container finished" podID="1fb8eee3-094e-4a75-b41e-5183c5f09278" containerID="bc35d40c83fc14655090b1b0ad4e92e1ceca84962cc0e9572df8655009fe3c37" exitCode=0 Dec 01 10:48:21 crc kubenswrapper[4761]: I1201 10:48:21.706868 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" event={"ID":"1fb8eee3-094e-4a75-b41e-5183c5f09278","Type":"ContainerDied","Data":"bc35d40c83fc14655090b1b0ad4e92e1ceca84962cc0e9572df8655009fe3c37"} Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.440760 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467"] Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.450831 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467"] Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.450946 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.456366 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.457383 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-srq6k" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.568866 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.590890 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-webhook-cert\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: \"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.590940 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2rd\" (UniqueName: \"kubernetes.io/projected/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-kube-api-access-2z2rd\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: \"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.590974 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-apiservice-cert\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: \"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.692341 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-credential-keys\") pod \"1fb8eee3-094e-4a75-b41e-5183c5f09278\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.692408 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-scripts\") pod \"1fb8eee3-094e-4a75-b41e-5183c5f09278\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.692510 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlvkz\" (UniqueName: \"kubernetes.io/projected/1fb8eee3-094e-4a75-b41e-5183c5f09278-kube-api-access-wlvkz\") pod \"1fb8eee3-094e-4a75-b41e-5183c5f09278\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.692539 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-fernet-keys\") pod \"1fb8eee3-094e-4a75-b41e-5183c5f09278\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.692570 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-config-data\") pod \"1fb8eee3-094e-4a75-b41e-5183c5f09278\" (UID: \"1fb8eee3-094e-4a75-b41e-5183c5f09278\") " Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.692729 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-apiservice-cert\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: \"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.692857 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-webhook-cert\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: 
\"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.692898 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2rd\" (UniqueName: \"kubernetes.io/projected/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-kube-api-access-2z2rd\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: \"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.697760 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1fb8eee3-094e-4a75-b41e-5183c5f09278" (UID: "1fb8eee3-094e-4a75-b41e-5183c5f09278"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.698227 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1fb8eee3-094e-4a75-b41e-5183c5f09278" (UID: "1fb8eee3-094e-4a75-b41e-5183c5f09278"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.698240 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-scripts" (OuterVolumeSpecName: "scripts") pod "1fb8eee3-094e-4a75-b41e-5183c5f09278" (UID: "1fb8eee3-094e-4a75-b41e-5183c5f09278"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.698814 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb8eee3-094e-4a75-b41e-5183c5f09278-kube-api-access-wlvkz" (OuterVolumeSpecName: "kube-api-access-wlvkz") pod "1fb8eee3-094e-4a75-b41e-5183c5f09278" (UID: "1fb8eee3-094e-4a75-b41e-5183c5f09278"). InnerVolumeSpecName "kube-api-access-wlvkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.701317 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-webhook-cert\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: \"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.712862 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2rd\" (UniqueName: \"kubernetes.io/projected/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-kube-api-access-2z2rd\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: \"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.714071 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3915eec-1b53-4ec3-b44c-ead2e1fdfe03-apiservice-cert\") pod \"horizon-operator-controller-manager-68949bdcb7-pd467\" (UID: \"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03\") " pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.719323 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-config-data" (OuterVolumeSpecName: "config-data") pod "1fb8eee3-094e-4a75-b41e-5183c5f09278" (UID: "1fb8eee3-094e-4a75-b41e-5183c5f09278"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.727004 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" event={"ID":"1fb8eee3-094e-4a75-b41e-5183c5f09278","Type":"ContainerDied","Data":"2e654ddad9b29f6db1734620b219ad5933a61201ec3207508b34c67fc0a2a4f8"} Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.727179 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e654ddad9b29f6db1734620b219ad5933a61201ec3207508b34c67fc0a2a4f8" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.727319 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-75g7j" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.772426 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.796221 4761 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.796428 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.796443 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlvkz\" (UniqueName: \"kubernetes.io/projected/1fb8eee3-094e-4a75-b41e-5183c5f09278-kube-api-access-wlvkz\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.796457 4761 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.796467 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb8eee3-094e-4a75-b41e-5183c5f09278-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.809584 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-69d7456d48-pnj5v"] Dec 01 10:48:23 crc kubenswrapper[4761]: E1201 10:48:23.810119 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb8eee3-094e-4a75-b41e-5183c5f09278" containerName="keystone-bootstrap" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.810137 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb8eee3-094e-4a75-b41e-5183c5f09278" containerName="keystone-bootstrap" Dec 01 
10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.810249 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb8eee3-094e-4a75-b41e-5183c5f09278" containerName="keystone-bootstrap" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.810800 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.815257 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-x8f27" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.815457 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.815521 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.815675 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.817883 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-69d7456d48-pnj5v"] Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.924150 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-scripts\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.924211 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-config-data\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " 
pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.924257 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-fernet-keys\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.924282 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-credential-keys\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:23 crc kubenswrapper[4761]: I1201 10:48:23.924336 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv52t\" (UniqueName: \"kubernetes.io/projected/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-kube-api-access-bv52t\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.026557 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-fernet-keys\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.026604 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-credential-keys\") pod \"keystone-69d7456d48-pnj5v\" (UID: 
\"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.026653 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv52t\" (UniqueName: \"kubernetes.io/projected/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-kube-api-access-bv52t\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.026682 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-scripts\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.026718 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-config-data\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.033085 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-scripts\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.033418 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-fernet-keys\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " 
pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.048129 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-credential-keys\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.048647 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-config-data\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.058520 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv52t\" (UniqueName: \"kubernetes.io/projected/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-kube-api-access-bv52t\") pod \"keystone-69d7456d48-pnj5v\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.089128 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467"] Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.142893 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.575153 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-69d7456d48-pnj5v"] Dec 01 10:48:24 crc kubenswrapper[4761]: W1201 10:48:24.580248 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded6e2a9d_eafc_42c7_8e81_9d5c5760c81c.slice/crio-430152392fa5cf915d1b9d6992b2f45c3790d7905fde375154879f3c0a12d059 WatchSource:0}: Error finding container 430152392fa5cf915d1b9d6992b2f45c3790d7905fde375154879f3c0a12d059: Status 404 returned error can't find the container with id 430152392fa5cf915d1b9d6992b2f45c3790d7905fde375154879f3c0a12d059 Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.733437 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" event={"ID":"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c","Type":"ContainerStarted","Data":"7b8b65954d8565b8b5a54a846d325c91d0722d4f767aceb447fad866dfdefbc3"} Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.733779 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" event={"ID":"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c","Type":"ContainerStarted","Data":"430152392fa5cf915d1b9d6992b2f45c3790d7905fde375154879f3c0a12d059"} Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.734912 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:24 crc kubenswrapper[4761]: I1201 10:48:24.739009 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" event={"ID":"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03","Type":"ContainerStarted","Data":"90eed3eb26039835d3780f8c054542505e224d1d7427b6b58e90ffaa5a776018"} Dec 01 10:48:24 crc kubenswrapper[4761]: 
I1201 10:48:24.761961 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" podStartSLOduration=1.76193617 podStartE2EDuration="1.76193617s" podCreationTimestamp="2025-12-01 10:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:24.757724447 +0000 UTC m=+1044.061483071" watchObservedRunningTime="2025-12-01 10:48:24.76193617 +0000 UTC m=+1044.065694794" Dec 01 10:48:26 crc kubenswrapper[4761]: I1201 10:48:26.759103 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" event={"ID":"c3915eec-1b53-4ec3-b44c-ead2e1fdfe03","Type":"ContainerStarted","Data":"e89bbef6d687580b430e6c9fb16384a05d030f70ebd8ddcb750f234fa7c59939"} Dec 01 10:48:26 crc kubenswrapper[4761]: I1201 10:48:26.759875 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:26 crc kubenswrapper[4761]: I1201 10:48:26.784221 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" podStartSLOduration=1.5984790709999999 podStartE2EDuration="3.78419738s" podCreationTimestamp="2025-12-01 10:48:23 +0000 UTC" firstStartedPulling="2025-12-01 10:48:24.092928714 +0000 UTC m=+1043.396687338" lastFinishedPulling="2025-12-01 10:48:26.278647023 +0000 UTC m=+1045.582405647" observedRunningTime="2025-12-01 10:48:26.782848823 +0000 UTC m=+1046.086607487" watchObservedRunningTime="2025-12-01 10:48:26.78419738 +0000 UTC m=+1046.087956014" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.320389 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg"] Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 
10:48:33.321980 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.325249 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.325739 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5xrmn" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.348539 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg"] Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.426935 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-apiservice-cert\") pod \"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.427022 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-webhook-cert\") pod \"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.427039 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfv5z\" (UniqueName: \"kubernetes.io/projected/1476658c-4234-4688-9c90-25ec6ba4a55d-kube-api-access-lfv5z\") pod 
\"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.528827 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-apiservice-cert\") pod \"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.528938 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-webhook-cert\") pod \"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.528964 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfv5z\" (UniqueName: \"kubernetes.io/projected/1476658c-4234-4688-9c90-25ec6ba4a55d-kube-api-access-lfv5z\") pod \"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.535832 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-apiservice-cert\") pod \"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 
10:48:33.539499 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-webhook-cert\") pod \"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.546724 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfv5z\" (UniqueName: \"kubernetes.io/projected/1476658c-4234-4688-9c90-25ec6ba4a55d-kube-api-access-lfv5z\") pod \"swift-operator-controller-manager-547f66dd48-gbkdg\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.641393 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.802912 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68949bdcb7-pd467" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.850498 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.850580 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.850620 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.851367 4761 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d30d5344481323b43a7d255c5c2b5f71119019ddc6b979360df65b87253e34d5"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.851420 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://d30d5344481323b43a7d255c5c2b5f71119019ddc6b979360df65b87253e34d5" gracePeriod=600 Dec 01 10:48:33 crc kubenswrapper[4761]: I1201 10:48:33.963112 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg"] Dec 01 10:48:34 crc kubenswrapper[4761]: I1201 10:48:34.820589 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" event={"ID":"1476658c-4234-4688-9c90-25ec6ba4a55d","Type":"ContainerStarted","Data":"0e1574aa22d11c87d6beb4e3941f239d5ebdee810a8649ac46a431924d2bafaa"} Dec 01 10:48:34 crc kubenswrapper[4761]: I1201 10:48:34.825759 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="d30d5344481323b43a7d255c5c2b5f71119019ddc6b979360df65b87253e34d5" exitCode=0 Dec 01 10:48:34 crc kubenswrapper[4761]: I1201 10:48:34.825794 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"d30d5344481323b43a7d255c5c2b5f71119019ddc6b979360df65b87253e34d5"} Dec 01 10:48:34 crc kubenswrapper[4761]: I1201 10:48:34.825834 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"ab11ccfedd2eeac2b7c4c9c4adbffd2e76c15b3f5230acf3c51b97fe7e1ab0cf"} Dec 01 10:48:34 crc kubenswrapper[4761]: I1201 10:48:34.825854 4761 scope.go:117] "RemoveContainer" containerID="3eb417125a9051f5c4c312a6fe5fbfd28525e926ddf81a026e3b1bb704152208" Dec 01 10:48:36 crc kubenswrapper[4761]: I1201 10:48:36.853185 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" event={"ID":"1476658c-4234-4688-9c90-25ec6ba4a55d","Type":"ContainerStarted","Data":"757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a"} Dec 01 10:48:36 crc kubenswrapper[4761]: I1201 10:48:36.853795 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:36 crc kubenswrapper[4761]: I1201 10:48:36.881609 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" podStartSLOduration=1.873247767 podStartE2EDuration="3.881573472s" podCreationTimestamp="2025-12-01 10:48:33 +0000 UTC" firstStartedPulling="2025-12-01 10:48:33.977510729 +0000 UTC m=+1053.281269353" lastFinishedPulling="2025-12-01 10:48:35.985836434 +0000 UTC m=+1055.289595058" observedRunningTime="2025-12-01 10:48:36.876454025 +0000 UTC m=+1056.180212649" watchObservedRunningTime="2025-12-01 10:48:36.881573472 +0000 UTC m=+1056.185332116" Dec 01 10:48:43 crc kubenswrapper[4761]: I1201 
10:48:43.652636 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.741369 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.746284 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.748789 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.748791 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.749079 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-m29c5" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.749728 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.764497 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.944540 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.944617 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") 
pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.944656 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-lock\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.944684 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-cache\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:47 crc kubenswrapper[4761]: I1201 10:48:47.944854 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fdrx\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-kube-api-access-4fdrx\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.046511 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.046641 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc 
kubenswrapper[4761]: I1201 10:48:48.046703 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-lock\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.046748 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-cache\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: E1201 10:48:48.046751 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:48:48 crc kubenswrapper[4761]: E1201 10:48:48.046791 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:48:48 crc kubenswrapper[4761]: E1201 10:48:48.046867 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift podName:20f34da4-e281-4e68-9a1f-02c97211a365 nodeName:}" failed. No retries permitted until 2025-12-01 10:48:48.546841029 +0000 UTC m=+1067.850599723 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift") pod "swift-storage-0" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365") : configmap "swift-ring-files" not found Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.046900 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fdrx\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-kube-api-access-4fdrx\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.047041 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.047341 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-lock\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.047512 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-cache\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.083653 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"swift-storage-0\" (UID: 
\"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.092358 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fdrx\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-kube-api-access-4fdrx\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.316532 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-rgk2z"] Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.317633 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.335747 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.336436 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.340510 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.351406 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-rgk2z"] Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.452082 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlcq2\" (UniqueName: \"kubernetes.io/projected/37a08ed0-59f3-4e0a-84ba-02a02a886e68-kube-api-access-vlcq2\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 
10:48:48.452418 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-swiftconf\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.452566 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-dispersionconf\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.452837 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a08ed0-59f3-4e0a-84ba-02a02a886e68-etc-swift\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.452948 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-ring-data-devices\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.452983 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-scripts\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 
crc kubenswrapper[4761]: I1201 10:48:48.553988 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-swiftconf\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.554042 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-dispersionconf\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.554083 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.554135 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a08ed0-59f3-4e0a-84ba-02a02a886e68-etc-swift\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.554181 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-ring-data-devices\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.554207 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-scripts\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.554369 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlcq2\" (UniqueName: \"kubernetes.io/projected/37a08ed0-59f3-4e0a-84ba-02a02a886e68-kube-api-access-vlcq2\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: E1201 10:48:48.554801 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:48:48 crc kubenswrapper[4761]: E1201 10:48:48.554835 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:48:48 crc kubenswrapper[4761]: E1201 10:48:48.554895 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift podName:20f34da4-e281-4e68-9a1f-02c97211a365 nodeName:}" failed. No retries permitted until 2025-12-01 10:48:49.554867572 +0000 UTC m=+1068.858626196 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift") pod "swift-storage-0" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365") : configmap "swift-ring-files" not found Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.555359 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a08ed0-59f3-4e0a-84ba-02a02a886e68-etc-swift\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.556040 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-scripts\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.556129 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-ring-data-devices\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.560091 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-dispersionconf\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.561101 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-swiftconf\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.591161 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlcq2\" (UniqueName: \"kubernetes.io/projected/37a08ed0-59f3-4e0a-84ba-02a02a886e68-kube-api-access-vlcq2\") pod \"swift-ring-rebalance-rgk2z\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:48 crc kubenswrapper[4761]: I1201 10:48:48.635939 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.458660 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-zznx6"] Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.459915 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-zznx6" Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.466714 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-bqpps" Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.468892 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-zznx6"] Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.516896 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-rgk2z"] Dec 01 10:48:49 crc kubenswrapper[4761]: W1201 10:48:49.521102 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37a08ed0_59f3_4e0a_84ba_02a02a886e68.slice/crio-2984f725ab9358674212b3199703ce709d5cdb43f700132256cc1b8a0ea22ec3 WatchSource:0}: Error finding container 2984f725ab9358674212b3199703ce709d5cdb43f700132256cc1b8a0ea22ec3: Status 404 returned error can't find the container with id 2984f725ab9358674212b3199703ce709d5cdb43f700132256cc1b8a0ea22ec3 Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.538454 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl2zz\" (UniqueName: \"kubernetes.io/projected/659cff19-952c-4f37-ba5e-9026cedc1d62-kube-api-access-wl2zz\") pod \"glance-operator-index-zznx6\" (UID: \"659cff19-952c-4f37-ba5e-9026cedc1d62\") " pod="openstack-operators/glance-operator-index-zznx6" Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.640284 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:49 crc kubenswrapper[4761]: E1201 
10:48:49.640486 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:48:49 crc kubenswrapper[4761]: E1201 10:48:49.640520 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.640499 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl2zz\" (UniqueName: \"kubernetes.io/projected/659cff19-952c-4f37-ba5e-9026cedc1d62-kube-api-access-wl2zz\") pod \"glance-operator-index-zznx6\" (UID: \"659cff19-952c-4f37-ba5e-9026cedc1d62\") " pod="openstack-operators/glance-operator-index-zznx6" Dec 01 10:48:49 crc kubenswrapper[4761]: E1201 10:48:49.640595 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift podName:20f34da4-e281-4e68-9a1f-02c97211a365 nodeName:}" failed. No retries permitted until 2025-12-01 10:48:51.640571795 +0000 UTC m=+1070.944330459 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift") pod "swift-storage-0" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365") : configmap "swift-ring-files" not found Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.664432 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl2zz\" (UniqueName: \"kubernetes.io/projected/659cff19-952c-4f37-ba5e-9026cedc1d62-kube-api-access-wl2zz\") pod \"glance-operator-index-zznx6\" (UID: \"659cff19-952c-4f37-ba5e-9026cedc1d62\") " pod="openstack-operators/glance-operator-index-zznx6" Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.781734 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-zznx6" Dec 01 10:48:49 crc kubenswrapper[4761]: I1201 10:48:49.952087 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" event={"ID":"37a08ed0-59f3-4e0a-84ba-02a02a886e68","Type":"ContainerStarted","Data":"2984f725ab9358674212b3199703ce709d5cdb43f700132256cc1b8a0ea22ec3"} Dec 01 10:48:50 crc kubenswrapper[4761]: I1201 10:48:50.243278 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-zznx6"] Dec 01 10:48:50 crc kubenswrapper[4761]: I1201 10:48:50.983317 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-zznx6" event={"ID":"659cff19-952c-4f37-ba5e-9026cedc1d62","Type":"ContainerStarted","Data":"4daac14268d54cbc985db885d958775f3f4c2aee277a77746f0e070352cc6855"} Dec 01 10:48:51 crc kubenswrapper[4761]: I1201 10:48:51.676638 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:51 crc kubenswrapper[4761]: E1201 10:48:51.676820 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:48:51 crc kubenswrapper[4761]: E1201 10:48:51.677067 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:48:51 crc kubenswrapper[4761]: E1201 10:48:51.677125 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift podName:20f34da4-e281-4e68-9a1f-02c97211a365 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:48:55.677108698 +0000 UTC m=+1074.980867322 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift") pod "swift-storage-0" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365") : configmap "swift-ring-files" not found Dec 01 10:48:54 crc kubenswrapper[4761]: I1201 10:48:54.055862 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-zznx6"] Dec 01 10:48:54 crc kubenswrapper[4761]: I1201 10:48:54.668064 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-2d6pk"] Dec 01 10:48:54 crc kubenswrapper[4761]: I1201 10:48:54.673739 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:48:54 crc kubenswrapper[4761]: I1201 10:48:54.674204 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-2d6pk"] Dec 01 10:48:54 crc kubenswrapper[4761]: I1201 10:48:54.824825 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv25n\" (UniqueName: \"kubernetes.io/projected/9e884079-1d5d-40f2-a169-f2f0781bad65-kube-api-access-zv25n\") pod \"glance-operator-index-2d6pk\" (UID: \"9e884079-1d5d-40f2-a169-f2f0781bad65\") " pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:48:54 crc kubenswrapper[4761]: I1201 10:48:54.926513 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv25n\" (UniqueName: \"kubernetes.io/projected/9e884079-1d5d-40f2-a169-f2f0781bad65-kube-api-access-zv25n\") pod \"glance-operator-index-2d6pk\" (UID: \"9e884079-1d5d-40f2-a169-f2f0781bad65\") " pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:48:54 crc kubenswrapper[4761]: I1201 10:48:54.959849 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv25n\" (UniqueName: \"kubernetes.io/projected/9e884079-1d5d-40f2-a169-f2f0781bad65-kube-api-access-zv25n\") pod \"glance-operator-index-2d6pk\" (UID: \"9e884079-1d5d-40f2-a169-f2f0781bad65\") " pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:48:54 crc kubenswrapper[4761]: I1201 10:48:54.992793 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:48:55 crc kubenswrapper[4761]: I1201 10:48:55.720763 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:48:55 crc kubenswrapper[4761]: I1201 10:48:55.741295 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:48:55 crc kubenswrapper[4761]: E1201 10:48:55.741526 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:48:55 crc kubenswrapper[4761]: E1201 10:48:55.741578 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:48:55 crc kubenswrapper[4761]: E1201 10:48:55.743825 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift podName:20f34da4-e281-4e68-9a1f-02c97211a365 nodeName:}" failed. No retries permitted until 2025-12-01 10:49:03.743777944 +0000 UTC m=+1083.047536608 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift") pod "swift-storage-0" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365") : configmap "swift-ring-files" not found Dec 01 10:48:57 crc kubenswrapper[4761]: I1201 10:48:57.796832 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-2d6pk"] Dec 01 10:48:57 crc kubenswrapper[4761]: W1201 10:48:57.800192 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e884079_1d5d_40f2_a169_f2f0781bad65.slice/crio-a8e520d3d970c6cb838982a4bba44afd7a948e5c3a681be5e97842a2fab2ef18 WatchSource:0}: Error finding container a8e520d3d970c6cb838982a4bba44afd7a948e5c3a681be5e97842a2fab2ef18: Status 404 returned error can't find the container with id a8e520d3d970c6cb838982a4bba44afd7a948e5c3a681be5e97842a2fab2ef18 Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.035920 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-2d6pk" event={"ID":"9e884079-1d5d-40f2-a169-f2f0781bad65","Type":"ContainerStarted","Data":"00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8"} Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.035961 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-2d6pk" event={"ID":"9e884079-1d5d-40f2-a169-f2f0781bad65","Type":"ContainerStarted","Data":"a8e520d3d970c6cb838982a4bba44afd7a948e5c3a681be5e97842a2fab2ef18"} Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.037989 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-zznx6" event={"ID":"659cff19-952c-4f37-ba5e-9026cedc1d62","Type":"ContainerStarted","Data":"e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c"} Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.038078 
4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-zznx6" podUID="659cff19-952c-4f37-ba5e-9026cedc1d62" containerName="registry-server" containerID="cri-o://e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c" gracePeriod=2 Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.039684 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" event={"ID":"37a08ed0-59f3-4e0a-84ba-02a02a886e68","Type":"ContainerStarted","Data":"02e78242e3db8585218da5a7e36f422dd1b206345fc629c97582aa60fce53a6b"} Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.074654 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-2d6pk" podStartSLOduration=4.021520749 podStartE2EDuration="4.074637949s" podCreationTimestamp="2025-12-01 10:48:54 +0000 UTC" firstStartedPulling="2025-12-01 10:48:57.804898309 +0000 UTC m=+1077.108656933" lastFinishedPulling="2025-12-01 10:48:57.858015509 +0000 UTC m=+1077.161774133" observedRunningTime="2025-12-01 10:48:58.059540183 +0000 UTC m=+1077.363298807" watchObservedRunningTime="2025-12-01 10:48:58.074637949 +0000 UTC m=+1077.378396573" Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.075352 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" podStartSLOduration=2.199295682 podStartE2EDuration="10.075348038s" podCreationTimestamp="2025-12-01 10:48:48 +0000 UTC" firstStartedPulling="2025-12-01 10:48:49.522932039 +0000 UTC m=+1068.826690663" lastFinishedPulling="2025-12-01 10:48:57.398984405 +0000 UTC m=+1076.702743019" observedRunningTime="2025-12-01 10:48:58.074853425 +0000 UTC m=+1077.378612059" watchObservedRunningTime="2025-12-01 10:48:58.075348038 +0000 UTC m=+1077.379106662" Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.098422 4761 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-zznx6" podStartSLOduration=2.02138343 podStartE2EDuration="9.098397569s" podCreationTimestamp="2025-12-01 10:48:49 +0000 UTC" firstStartedPulling="2025-12-01 10:48:50.253706898 +0000 UTC m=+1069.557465512" lastFinishedPulling="2025-12-01 10:48:57.330720987 +0000 UTC m=+1076.634479651" observedRunningTime="2025-12-01 10:48:58.09732772 +0000 UTC m=+1077.401086364" watchObservedRunningTime="2025-12-01 10:48:58.098397569 +0000 UTC m=+1077.402156193" Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.434854 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-zznx6" Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.489190 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl2zz\" (UniqueName: \"kubernetes.io/projected/659cff19-952c-4f37-ba5e-9026cedc1d62-kube-api-access-wl2zz\") pod \"659cff19-952c-4f37-ba5e-9026cedc1d62\" (UID: \"659cff19-952c-4f37-ba5e-9026cedc1d62\") " Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.507768 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659cff19-952c-4f37-ba5e-9026cedc1d62-kube-api-access-wl2zz" (OuterVolumeSpecName: "kube-api-access-wl2zz") pod "659cff19-952c-4f37-ba5e-9026cedc1d62" (UID: "659cff19-952c-4f37-ba5e-9026cedc1d62"). InnerVolumeSpecName "kube-api-access-wl2zz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:58 crc kubenswrapper[4761]: I1201 10:48:58.591270 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl2zz\" (UniqueName: \"kubernetes.io/projected/659cff19-952c-4f37-ba5e-9026cedc1d62-kube-api-access-wl2zz\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.049624 4761 generic.go:334] "Generic (PLEG): container finished" podID="659cff19-952c-4f37-ba5e-9026cedc1d62" containerID="e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c" exitCode=0 Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.049699 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-zznx6" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.049694 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-zznx6" event={"ID":"659cff19-952c-4f37-ba5e-9026cedc1d62","Type":"ContainerDied","Data":"e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c"} Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.049761 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-zznx6" event={"ID":"659cff19-952c-4f37-ba5e-9026cedc1d62","Type":"ContainerDied","Data":"4daac14268d54cbc985db885d958775f3f4c2aee277a77746f0e070352cc6855"} Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.049791 4761 scope.go:117] "RemoveContainer" containerID="e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.069312 4761 scope.go:117] "RemoveContainer" containerID="e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c" Dec 01 10:48:59 crc kubenswrapper[4761]: E1201 10:48:59.069843 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c\": container with ID starting with e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c not found: ID does not exist" containerID="e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.069899 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c"} err="failed to get container status \"e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c\": rpc error: code = NotFound desc = could not find container \"e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c\": container with ID starting with e7b71c682ecc165ad4266383f7ea51aa80d26bd42aa943a2d57fa3b55a2a480c not found: ID does not exist" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.082716 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-zznx6"] Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.090478 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-zznx6"] Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.151909 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659cff19-952c-4f37-ba5e-9026cedc1d62" path="/var/lib/kubelet/pods/659cff19-952c-4f37-ba5e-9026cedc1d62/volumes" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.489248 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb"] Dec 01 10:48:59 crc kubenswrapper[4761]: E1201 10:48:59.489569 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659cff19-952c-4f37-ba5e-9026cedc1d62" containerName="registry-server" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.489589 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="659cff19-952c-4f37-ba5e-9026cedc1d62" 
containerName="registry-server" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.489756 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="659cff19-952c-4f37-ba5e-9026cedc1d62" containerName="registry-server" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.490627 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.509492 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb"] Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.607714 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.607795 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjnjt\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-kube-api-access-xjnjt\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.607843 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-run-httpd\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.607861 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-log-httpd\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.607887 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e546fe9d-d4e0-475b-a1c5-034b718ea4de-config-data\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.709634 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.709704 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjnjt\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-kube-api-access-xjnjt\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.709759 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-run-httpd\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.709784 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-log-httpd\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.709823 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e546fe9d-d4e0-475b-a1c5-034b718ea4de-config-data\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: E1201 10:48:59.709885 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:48:59 crc kubenswrapper[4761]: E1201 10:48:59.709920 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb: configmap "swift-ring-files" not found Dec 01 10:48:59 crc kubenswrapper[4761]: E1201 10:48:59.709987 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift podName:e546fe9d-d4e0-475b-a1c5-034b718ea4de nodeName:}" failed. No retries permitted until 2025-12-01 10:49:00.209962165 +0000 UTC m=+1079.513720879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift") pod "swift-proxy-6bd58cfcf7-cq9vb" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de") : configmap "swift-ring-files" not found Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.710642 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-log-httpd\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.710784 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-run-httpd\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.716363 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e546fe9d-d4e0-475b-a1c5-034b718ea4de-config-data\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:48:59 crc kubenswrapper[4761]: I1201 10:48:59.740624 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjnjt\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-kube-api-access-xjnjt\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:00 crc kubenswrapper[4761]: I1201 10:49:00.217962 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:00 crc kubenswrapper[4761]: E1201 10:49:00.218845 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:49:00 crc kubenswrapper[4761]: E1201 10:49:00.218864 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb: configmap "swift-ring-files" not found Dec 01 10:49:00 crc kubenswrapper[4761]: E1201 10:49:00.218899 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift podName:e546fe9d-d4e0-475b-a1c5-034b718ea4de nodeName:}" failed. No retries permitted until 2025-12-01 10:49:01.218886583 +0000 UTC m=+1080.522645207 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift") pod "swift-proxy-6bd58cfcf7-cq9vb" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de") : configmap "swift-ring-files" not found Dec 01 10:49:01 crc kubenswrapper[4761]: I1201 10:49:01.230305 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:01 crc kubenswrapper[4761]: E1201 10:49:01.231302 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:49:01 crc kubenswrapper[4761]: E1201 10:49:01.231329 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb: configmap "swift-ring-files" not found Dec 01 10:49:01 crc kubenswrapper[4761]: E1201 10:49:01.231375 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift podName:e546fe9d-d4e0-475b-a1c5-034b718ea4de nodeName:}" failed. No retries permitted until 2025-12-01 10:49:03.231357223 +0000 UTC m=+1082.535115867 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift") pod "swift-proxy-6bd58cfcf7-cq9vb" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de") : configmap "swift-ring-files" not found Dec 01 10:49:03 crc kubenswrapper[4761]: I1201 10:49:03.276875 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:03 crc kubenswrapper[4761]: E1201 10:49:03.277134 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:49:03 crc kubenswrapper[4761]: E1201 10:49:03.277287 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb: configmap "swift-ring-files" not found Dec 01 10:49:03 crc kubenswrapper[4761]: E1201 10:49:03.277342 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift podName:e546fe9d-d4e0-475b-a1c5-034b718ea4de nodeName:}" failed. No retries permitted until 2025-12-01 10:49:07.277326832 +0000 UTC m=+1086.581085456 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift") pod "swift-proxy-6bd58cfcf7-cq9vb" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de") : configmap "swift-ring-files" not found Dec 01 10:49:03 crc kubenswrapper[4761]: I1201 10:49:03.783617 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:49:03 crc kubenswrapper[4761]: E1201 10:49:03.784105 4761 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:49:03 crc kubenswrapper[4761]: E1201 10:49:03.784156 4761 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:49:03 crc kubenswrapper[4761]: E1201 10:49:03.784236 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift podName:20f34da4-e281-4e68-9a1f-02c97211a365 nodeName:}" failed. No retries permitted until 2025-12-01 10:49:19.784209355 +0000 UTC m=+1099.087968019 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift") pod "swift-storage-0" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365") : configmap "swift-ring-files" not found Dec 01 10:49:04 crc kubenswrapper[4761]: I1201 10:49:04.993176 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:49:04 crc kubenswrapper[4761]: I1201 10:49:04.993869 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:49:05 crc kubenswrapper[4761]: I1201 10:49:05.040477 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:49:05 crc kubenswrapper[4761]: I1201 10:49:05.100067 4761 generic.go:334] "Generic (PLEG): container finished" podID="37a08ed0-59f3-4e0a-84ba-02a02a886e68" containerID="02e78242e3db8585218da5a7e36f422dd1b206345fc629c97582aa60fce53a6b" exitCode=0 Dec 01 10:49:05 crc kubenswrapper[4761]: I1201 10:49:05.100209 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" event={"ID":"37a08ed0-59f3-4e0a-84ba-02a02a886e68","Type":"ContainerDied","Data":"02e78242e3db8585218da5a7e36f422dd1b206345fc629c97582aa60fce53a6b"} Dec 01 10:49:05 crc kubenswrapper[4761]: I1201 10:49:05.140630 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.440630 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.526884 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlcq2\" (UniqueName: \"kubernetes.io/projected/37a08ed0-59f3-4e0a-84ba-02a02a886e68-kube-api-access-vlcq2\") pod \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.526982 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-scripts\") pod \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.527074 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a08ed0-59f3-4e0a-84ba-02a02a886e68-etc-swift\") pod \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.527138 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-dispersionconf\") pod \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.527216 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-ring-data-devices\") pod \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.527256 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-swiftconf\") pod \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\" (UID: \"37a08ed0-59f3-4e0a-84ba-02a02a886e68\") " Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.528497 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "37a08ed0-59f3-4e0a-84ba-02a02a886e68" (UID: "37a08ed0-59f3-4e0a-84ba-02a02a886e68"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.528848 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a08ed0-59f3-4e0a-84ba-02a02a886e68-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "37a08ed0-59f3-4e0a-84ba-02a02a886e68" (UID: "37a08ed0-59f3-4e0a-84ba-02a02a886e68"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.534750 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a08ed0-59f3-4e0a-84ba-02a02a886e68-kube-api-access-vlcq2" (OuterVolumeSpecName: "kube-api-access-vlcq2") pod "37a08ed0-59f3-4e0a-84ba-02a02a886e68" (UID: "37a08ed0-59f3-4e0a-84ba-02a02a886e68"). InnerVolumeSpecName "kube-api-access-vlcq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.534796 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "37a08ed0-59f3-4e0a-84ba-02a02a886e68" (UID: "37a08ed0-59f3-4e0a-84ba-02a02a886e68"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.546736 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "37a08ed0-59f3-4e0a-84ba-02a02a886e68" (UID: "37a08ed0-59f3-4e0a-84ba-02a02a886e68"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.546820 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-scripts" (OuterVolumeSpecName: "scripts") pod "37a08ed0-59f3-4e0a-84ba-02a02a886e68" (UID: "37a08ed0-59f3-4e0a-84ba-02a02a886e68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.629075 4761 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a08ed0-59f3-4e0a-84ba-02a02a886e68-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.629108 4761 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.629119 4761 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.629128 4761 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37a08ed0-59f3-4e0a-84ba-02a02a886e68-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 
10:49:06.629136 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlcq2\" (UniqueName: \"kubernetes.io/projected/37a08ed0-59f3-4e0a-84ba-02a02a886e68-kube-api-access-vlcq2\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:06 crc kubenswrapper[4761]: I1201 10:49:06.629144 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a08ed0-59f3-4e0a-84ba-02a02a886e68-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.131108 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.140962 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-rgk2z" event={"ID":"37a08ed0-59f3-4e0a-84ba-02a02a886e68","Type":"ContainerDied","Data":"2984f725ab9358674212b3199703ce709d5cdb43f700132256cc1b8a0ea22ec3"} Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.141218 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2984f725ab9358674212b3199703ce709d5cdb43f700132256cc1b8a0ea22ec3" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.339480 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.348291 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") pod \"swift-proxy-6bd58cfcf7-cq9vb\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 
10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.613491 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.729591 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb"] Dec 01 10:49:07 crc kubenswrapper[4761]: E1201 10:49:07.730641 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a08ed0-59f3-4e0a-84ba-02a02a886e68" containerName="swift-ring-rebalance" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.730753 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a08ed0-59f3-4e0a-84ba-02a02a886e68" containerName="swift-ring-rebalance" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.731000 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a08ed0-59f3-4e0a-84ba-02a02a886e68" containerName="swift-ring-rebalance" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.732255 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.740967 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb"] Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.745755 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8w9gk" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.846598 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-bundle\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.846654 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-util\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.846701 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qwpb\" (UniqueName: \"kubernetes.io/projected/728cc888-0261-42fc-93da-a9f5ddd03382-kube-api-access-5qwpb\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 
10:49:07.948039 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-bundle\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.948089 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-util\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.948136 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qwpb\" (UniqueName: \"kubernetes.io/projected/728cc888-0261-42fc-93da-a9f5ddd03382-kube-api-access-5qwpb\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.948566 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-bundle\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.948604 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-util\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:07 crc kubenswrapper[4761]: I1201 10:49:07.975301 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qwpb\" (UniqueName: \"kubernetes.io/projected/728cc888-0261-42fc-93da-a9f5ddd03382-kube-api-access-5qwpb\") pod \"252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:08 crc kubenswrapper[4761]: I1201 10:49:08.059232 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:08 crc kubenswrapper[4761]: I1201 10:49:08.131221 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb"] Dec 01 10:49:08 crc kubenswrapper[4761]: W1201 10:49:08.154795 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode546fe9d_d4e0_475b_a1c5_034b718ea4de.slice/crio-3d056e2cb6003065446142a870491fe25a3a47d8a64b54643ad71b2b740ae7eb WatchSource:0}: Error finding container 3d056e2cb6003065446142a870491fe25a3a47d8a64b54643ad71b2b740ae7eb: Status 404 returned error can't find the container with id 3d056e2cb6003065446142a870491fe25a3a47d8a64b54643ad71b2b740ae7eb Dec 01 10:49:08 crc kubenswrapper[4761]: W1201 10:49:08.477756 4761 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod728cc888_0261_42fc_93da_a9f5ddd03382.slice/crio-f9ad421196224e578e09a5029ed804fbaa944ee50d61ae4b35c776e32dbebc28 WatchSource:0}: Error finding container f9ad421196224e578e09a5029ed804fbaa944ee50d61ae4b35c776e32dbebc28: Status 404 returned error can't find the container with id f9ad421196224e578e09a5029ed804fbaa944ee50d61ae4b35c776e32dbebc28 Dec 01 10:49:08 crc kubenswrapper[4761]: I1201 10:49:08.478167 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb"] Dec 01 10:49:09 crc kubenswrapper[4761]: I1201 10:49:09.156208 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" event={"ID":"e546fe9d-d4e0-475b-a1c5-034b718ea4de","Type":"ContainerStarted","Data":"a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf"} Dec 01 10:49:09 crc kubenswrapper[4761]: I1201 10:49:09.156257 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" event={"ID":"e546fe9d-d4e0-475b-a1c5-034b718ea4de","Type":"ContainerStarted","Data":"18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1"} Dec 01 10:49:09 crc kubenswrapper[4761]: I1201 10:49:09.156273 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" event={"ID":"e546fe9d-d4e0-475b-a1c5-034b718ea4de","Type":"ContainerStarted","Data":"3d056e2cb6003065446142a870491fe25a3a47d8a64b54643ad71b2b740ae7eb"} Dec 01 10:49:09 crc kubenswrapper[4761]: I1201 10:49:09.156451 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:09 crc kubenswrapper[4761]: I1201 10:49:09.158281 4761 generic.go:334] "Generic (PLEG): container finished" podID="728cc888-0261-42fc-93da-a9f5ddd03382" 
containerID="e8bb3ec6cdf14940b9b45e0dc53f2cd8adc1388a4630439431039fbfdd12d7c3" exitCode=0 Dec 01 10:49:09 crc kubenswrapper[4761]: I1201 10:49:09.158343 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" event={"ID":"728cc888-0261-42fc-93da-a9f5ddd03382","Type":"ContainerDied","Data":"e8bb3ec6cdf14940b9b45e0dc53f2cd8adc1388a4630439431039fbfdd12d7c3"} Dec 01 10:49:09 crc kubenswrapper[4761]: I1201 10:49:09.158380 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" event={"ID":"728cc888-0261-42fc-93da-a9f5ddd03382","Type":"ContainerStarted","Data":"f9ad421196224e578e09a5029ed804fbaa944ee50d61ae4b35c776e32dbebc28"} Dec 01 10:49:09 crc kubenswrapper[4761]: I1201 10:49:09.182060 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" podStartSLOduration=10.182037008 podStartE2EDuration="10.182037008s" podCreationTimestamp="2025-12-01 10:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:49:09.177377283 +0000 UTC m=+1088.481135917" watchObservedRunningTime="2025-12-01 10:49:09.182037008 +0000 UTC m=+1088.485795632" Dec 01 10:49:10 crc kubenswrapper[4761]: I1201 10:49:10.169772 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" event={"ID":"728cc888-0261-42fc-93da-a9f5ddd03382","Type":"ContainerStarted","Data":"3775b7f5b5da5a4e95f3199cc37879688beb36c61d95e7ab45874b69e3505484"} Dec 01 10:49:10 crc kubenswrapper[4761]: I1201 10:49:10.170119 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:11 crc kubenswrapper[4761]: I1201 10:49:11.184061 
4761 generic.go:334] "Generic (PLEG): container finished" podID="728cc888-0261-42fc-93da-a9f5ddd03382" containerID="3775b7f5b5da5a4e95f3199cc37879688beb36c61d95e7ab45874b69e3505484" exitCode=0 Dec 01 10:49:11 crc kubenswrapper[4761]: I1201 10:49:11.184157 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" event={"ID":"728cc888-0261-42fc-93da-a9f5ddd03382","Type":"ContainerDied","Data":"3775b7f5b5da5a4e95f3199cc37879688beb36c61d95e7ab45874b69e3505484"} Dec 01 10:49:12 crc kubenswrapper[4761]: I1201 10:49:12.200429 4761 generic.go:334] "Generic (PLEG): container finished" podID="728cc888-0261-42fc-93da-a9f5ddd03382" containerID="14d87bcaf79fd9b037204d8c67015069d18134a279a232db8b249f038dcfe77a" exitCode=0 Dec 01 10:49:12 crc kubenswrapper[4761]: I1201 10:49:12.200515 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" event={"ID":"728cc888-0261-42fc-93da-a9f5ddd03382","Type":"ContainerDied","Data":"14d87bcaf79fd9b037204d8c67015069d18134a279a232db8b249f038dcfe77a"} Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.607000 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.794008 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-util\") pod \"728cc888-0261-42fc-93da-a9f5ddd03382\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.794095 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qwpb\" (UniqueName: \"kubernetes.io/projected/728cc888-0261-42fc-93da-a9f5ddd03382-kube-api-access-5qwpb\") pod \"728cc888-0261-42fc-93da-a9f5ddd03382\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.794181 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-bundle\") pod \"728cc888-0261-42fc-93da-a9f5ddd03382\" (UID: \"728cc888-0261-42fc-93da-a9f5ddd03382\") " Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.795468 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-bundle" (OuterVolumeSpecName: "bundle") pod "728cc888-0261-42fc-93da-a9f5ddd03382" (UID: "728cc888-0261-42fc-93da-a9f5ddd03382"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.802450 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728cc888-0261-42fc-93da-a9f5ddd03382-kube-api-access-5qwpb" (OuterVolumeSpecName: "kube-api-access-5qwpb") pod "728cc888-0261-42fc-93da-a9f5ddd03382" (UID: "728cc888-0261-42fc-93da-a9f5ddd03382"). InnerVolumeSpecName "kube-api-access-5qwpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.808262 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-util" (OuterVolumeSpecName: "util") pod "728cc888-0261-42fc-93da-a9f5ddd03382" (UID: "728cc888-0261-42fc-93da-a9f5ddd03382"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.895743 4761 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.895777 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qwpb\" (UniqueName: \"kubernetes.io/projected/728cc888-0261-42fc-93da-a9f5ddd03382-kube-api-access-5qwpb\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:13 crc kubenswrapper[4761]: I1201 10:49:13.895795 4761 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/728cc888-0261-42fc-93da-a9f5ddd03382-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:14 crc kubenswrapper[4761]: I1201 10:49:14.223634 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" event={"ID":"728cc888-0261-42fc-93da-a9f5ddd03382","Type":"ContainerDied","Data":"f9ad421196224e578e09a5029ed804fbaa944ee50d61ae4b35c776e32dbebc28"} Dec 01 10:49:14 crc kubenswrapper[4761]: I1201 10:49:14.224030 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9ad421196224e578e09a5029ed804fbaa944ee50d61ae4b35c776e32dbebc28" Dec 01 10:49:14 crc kubenswrapper[4761]: I1201 10:49:14.223805 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb" Dec 01 10:49:17 crc kubenswrapper[4761]: I1201 10:49:17.617169 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:17 crc kubenswrapper[4761]: I1201 10:49:17.619490 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:49:19 crc kubenswrapper[4761]: I1201 10:49:19.798461 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:49:19 crc kubenswrapper[4761]: I1201 10:49:19.805347 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"swift-storage-0\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:49:19 crc kubenswrapper[4761]: I1201 10:49:19.870305 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:49:20 crc kubenswrapper[4761]: I1201 10:49:20.324058 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 01 10:49:20 crc kubenswrapper[4761]: W1201 10:49:20.325672 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-9816df5aba32f3957e11a874249ec2b76f4fe6b99bec96944b56afc83080b688 WatchSource:0}: Error finding container 9816df5aba32f3957e11a874249ec2b76f4fe6b99bec96944b56afc83080b688: Status 404 returned error can't find the container with id 9816df5aba32f3957e11a874249ec2b76f4fe6b99bec96944b56afc83080b688 Dec 01 10:49:21 crc kubenswrapper[4761]: I1201 10:49:21.281457 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"9816df5aba32f3957e11a874249ec2b76f4fe6b99bec96944b56afc83080b688"} Dec 01 10:49:22 crc kubenswrapper[4761]: I1201 10:49:22.290900 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa"} Dec 01 10:49:22 crc kubenswrapper[4761]: I1201 10:49:22.291146 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a"} Dec 01 10:49:23 crc kubenswrapper[4761]: I1201 10:49:23.301405 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479"} Dec 01 10:49:23 crc kubenswrapper[4761]: I1201 10:49:23.302333 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1"} Dec 01 10:49:24 crc kubenswrapper[4761]: I1201 10:49:24.349736 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75"} Dec 01 10:49:24 crc kubenswrapper[4761]: I1201 10:49:24.350083 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b"} Dec 01 10:49:24 crc kubenswrapper[4761]: I1201 10:49:24.350099 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835"} Dec 01 10:49:25 crc kubenswrapper[4761]: I1201 10:49:25.367297 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187"} Dec 01 10:49:26 crc kubenswrapper[4761]: I1201 10:49:26.381158 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a"} Dec 01 10:49:26 crc kubenswrapper[4761]: I1201 10:49:26.381714 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b"} Dec 01 10:49:26 crc kubenswrapper[4761]: I1201 10:49:26.381726 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09"} Dec 01 10:49:26 crc kubenswrapper[4761]: I1201 10:49:26.381736 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577"} Dec 01 10:49:26 crc kubenswrapper[4761]: I1201 10:49:26.381747 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f"} Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.416578 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341"} Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.540611 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md"] Dec 01 10:49:27 crc kubenswrapper[4761]: E1201 10:49:27.540882 4761 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="728cc888-0261-42fc-93da-a9f5ddd03382" containerName="util" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.540912 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="728cc888-0261-42fc-93da-a9f5ddd03382" containerName="util" Dec 01 10:49:27 crc kubenswrapper[4761]: E1201 10:49:27.540931 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728cc888-0261-42fc-93da-a9f5ddd03382" containerName="pull" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.540938 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="728cc888-0261-42fc-93da-a9f5ddd03382" containerName="pull" Dec 01 10:49:27 crc kubenswrapper[4761]: E1201 10:49:27.540948 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728cc888-0261-42fc-93da-a9f5ddd03382" containerName="extract" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.540954 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="728cc888-0261-42fc-93da-a9f5ddd03382" containerName="extract" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.541112 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="728cc888-0261-42fc-93da-a9f5ddd03382" containerName="extract" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.541620 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.543151 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.544576 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4qdrt" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.551160 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md"] Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.718160 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-apiservice-cert\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.718232 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzst\" (UniqueName: \"kubernetes.io/projected/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-kube-api-access-qkzst\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.718525 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-webhook-cert\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: 
\"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.820309 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-apiservice-cert\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.820383 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzst\" (UniqueName: \"kubernetes.io/projected/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-kube-api-access-qkzst\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.820575 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-webhook-cert\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.825904 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-apiservice-cert\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.827260 4761 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-webhook-cert\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.847666 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzst\" (UniqueName: \"kubernetes.io/projected/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-kube-api-access-qkzst\") pod \"glance-operator-controller-manager-7958ffffd8-wm6md\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:27 crc kubenswrapper[4761]: I1201 10:49:27.861869 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:28 crc kubenswrapper[4761]: I1201 10:49:28.298647 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md"] Dec 01 10:49:28 crc kubenswrapper[4761]: I1201 10:49:28.424006 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" event={"ID":"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014","Type":"ContainerStarted","Data":"53d1917a6265c38a69b8117b30fca0de48e6f6a8c862d239632e55ad16e14af0"} Dec 01 10:49:29 crc kubenswrapper[4761]: I1201 10:49:29.435935 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerStarted","Data":"3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e"} Dec 01 10:49:29 crc kubenswrapper[4761]: I1201 10:49:29.479854 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=38.310005703 podStartE2EDuration="43.479837638s" podCreationTimestamp="2025-12-01 10:48:46 +0000 UTC" firstStartedPulling="2025-12-01 10:49:20.328923199 +0000 UTC m=+1099.632681823" lastFinishedPulling="2025-12-01 10:49:25.498755144 +0000 UTC m=+1104.802513758" observedRunningTime="2025-12-01 10:49:29.475531379 +0000 UTC m=+1108.779290023" watchObservedRunningTime="2025-12-01 10:49:29.479837638 +0000 UTC m=+1108.783596262" Dec 01 10:49:30 crc kubenswrapper[4761]: I1201 10:49:30.446864 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" event={"ID":"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014","Type":"ContainerStarted","Data":"9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8"} Dec 01 10:49:30 crc kubenswrapper[4761]: I1201 10:49:30.474915 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" podStartSLOduration=1.748460699 podStartE2EDuration="3.474889677s" podCreationTimestamp="2025-12-01 10:49:27 +0000 UTC" firstStartedPulling="2025-12-01 10:49:28.295935856 +0000 UTC m=+1107.599694500" lastFinishedPulling="2025-12-01 10:49:30.022364854 +0000 UTC m=+1109.326123478" observedRunningTime="2025-12-01 10:49:30.465631044 +0000 UTC m=+1109.769389698" watchObservedRunningTime="2025-12-01 10:49:30.474889677 +0000 UTC m=+1109.778648331" Dec 01 10:49:31 crc kubenswrapper[4761]: I1201 10:49:31.452445 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:37 crc kubenswrapper[4761]: I1201 10:49:37.869099 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.843348 4761 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx"] Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.844977 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.849392 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.850583 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-7bztp"] Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.851947 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.858817 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx"] Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.861509 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwhh\" (UniqueName: \"kubernetes.io/projected/1e1ab633-481e-4dfa-8e73-a179d903bcbd-kube-api-access-7rwhh\") pod \"glance-db-create-7bztp\" (UID: \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\") " pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.861663 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97cfb610-4283-487f-b8be-561479406c65-operator-scripts\") pod \"glance-2b4d-account-create-update-x6kjx\" (UID: \"97cfb610-4283-487f-b8be-561479406c65\") " pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.861728 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9xf4\" (UniqueName: \"kubernetes.io/projected/97cfb610-4283-487f-b8be-561479406c65-kube-api-access-t9xf4\") pod \"glance-2b4d-account-create-update-x6kjx\" (UID: \"97cfb610-4283-487f-b8be-561479406c65\") " pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.861786 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e1ab633-481e-4dfa-8e73-a179d903bcbd-operator-scripts\") pod \"glance-db-create-7bztp\" (UID: \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\") " pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.866333 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-7bztp"] Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.906703 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.907618 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.911783 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.911877 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.912191 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.912352 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-4g9l2" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.913990 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.962456 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwhh\" (UniqueName: \"kubernetes.io/projected/1e1ab633-481e-4dfa-8e73-a179d903bcbd-kube-api-access-7rwhh\") pod \"glance-db-create-7bztp\" (UID: \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\") " pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.962503 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97cfb610-4283-487f-b8be-561479406c65-operator-scripts\") pod \"glance-2b4d-account-create-update-x6kjx\" (UID: \"97cfb610-4283-487f-b8be-561479406c65\") " pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.962523 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/a0591941-4952-4c56-868f-e1bae8575651-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.962542 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-scripts\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.962613 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n62hc\" (UniqueName: \"kubernetes.io/projected/a0591941-4952-4c56-868f-e1bae8575651-kube-api-access-n62hc\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.962649 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9xf4\" (UniqueName: \"kubernetes.io/projected/97cfb610-4283-487f-b8be-561479406c65-kube-api-access-t9xf4\") pod \"glance-2b4d-account-create-update-x6kjx\" (UID: \"97cfb610-4283-487f-b8be-561479406c65\") " pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.962679 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-config\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.962712 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/1e1ab633-481e-4dfa-8e73-a179d903bcbd-operator-scripts\") pod \"glance-db-create-7bztp\" (UID: \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\") " pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.963382 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e1ab633-481e-4dfa-8e73-a179d903bcbd-operator-scripts\") pod \"glance-db-create-7bztp\" (UID: \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\") " pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.963398 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97cfb610-4283-487f-b8be-561479406c65-operator-scripts\") pod \"glance-2b4d-account-create-update-x6kjx\" (UID: \"97cfb610-4283-487f-b8be-561479406c65\") " pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.984214 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwhh\" (UniqueName: \"kubernetes.io/projected/1e1ab633-481e-4dfa-8e73-a179d903bcbd-kube-api-access-7rwhh\") pod \"glance-db-create-7bztp\" (UID: \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\") " pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:42 crc kubenswrapper[4761]: I1201 10:49:42.986138 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9xf4\" (UniqueName: \"kubernetes.io/projected/97cfb610-4283-487f-b8be-561479406c65-kube-api-access-t9xf4\") pod \"glance-2b4d-account-create-update-x6kjx\" (UID: \"97cfb610-4283-487f-b8be-561479406c65\") " pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.063526 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n62hc\" (UniqueName: \"kubernetes.io/projected/a0591941-4952-4c56-868f-e1bae8575651-kube-api-access-n62hc\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.063598 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-config\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.063660 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0591941-4952-4c56-868f-e1bae8575651-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.063678 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-scripts\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.064510 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-config\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.064589 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-scripts\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.067484 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0591941-4952-4c56-868f-e1bae8575651-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.077543 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n62hc\" (UniqueName: \"kubernetes.io/projected/a0591941-4952-4c56-868f-e1bae8575651-kube-api-access-n62hc\") pod \"openstackclient\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.176624 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.193406 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.227970 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.616234 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx"] Dec 01 10:49:43 crc kubenswrapper[4761]: W1201 10:49:43.619645 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97cfb610_4283_487f_b8be_561479406c65.slice/crio-b2626e4ad08697d8bb6faa5d462faa885214dc1af1539532d51d3d6ac7f37504 WatchSource:0}: Error finding container b2626e4ad08697d8bb6faa5d462faa885214dc1af1539532d51d3d6ac7f37504: Status 404 returned error can't find the container with id b2626e4ad08697d8bb6faa5d462faa885214dc1af1539532d51d3d6ac7f37504 Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.692809 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-7bztp"] Dec 01 10:49:43 crc kubenswrapper[4761]: I1201 10:49:43.926127 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:49:44 crc kubenswrapper[4761]: I1201 10:49:44.562360 4761 generic.go:334] "Generic (PLEG): container finished" podID="1e1ab633-481e-4dfa-8e73-a179d903bcbd" containerID="ce3a1f2e906938bf994dcac8838933c8d5e8160fd5006a25e6c8703edcd086c8" exitCode=0 Dec 01 10:49:44 crc kubenswrapper[4761]: I1201 10:49:44.562494 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-7bztp" event={"ID":"1e1ab633-481e-4dfa-8e73-a179d903bcbd","Type":"ContainerDied","Data":"ce3a1f2e906938bf994dcac8838933c8d5e8160fd5006a25e6c8703edcd086c8"} Dec 01 10:49:44 crc kubenswrapper[4761]: I1201 10:49:44.562955 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-7bztp" 
event={"ID":"1e1ab633-481e-4dfa-8e73-a179d903bcbd","Type":"ContainerStarted","Data":"0e4d1cbaffd6861cf13b69c924c351ba7232269a57ee75a9aa9c89dfacded8b0"} Dec 01 10:49:44 crc kubenswrapper[4761]: I1201 10:49:44.564919 4761 generic.go:334] "Generic (PLEG): container finished" podID="97cfb610-4283-487f-b8be-561479406c65" containerID="25f8f90637e9fc153c00c941ce83f59a23865f5a6e37ab2bd9a74ccac2671644" exitCode=0 Dec 01 10:49:44 crc kubenswrapper[4761]: I1201 10:49:44.565032 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" event={"ID":"97cfb610-4283-487f-b8be-561479406c65","Type":"ContainerDied","Data":"25f8f90637e9fc153c00c941ce83f59a23865f5a6e37ab2bd9a74ccac2671644"} Dec 01 10:49:44 crc kubenswrapper[4761]: I1201 10:49:44.565111 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" event={"ID":"97cfb610-4283-487f-b8be-561479406c65","Type":"ContainerStarted","Data":"b2626e4ad08697d8bb6faa5d462faa885214dc1af1539532d51d3d6ac7f37504"} Dec 01 10:49:44 crc kubenswrapper[4761]: I1201 10:49:44.566580 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"a0591941-4952-4c56-868f-e1bae8575651","Type":"ContainerStarted","Data":"a8f1ada961510a91b3cbea5c83fd4b9ce6eea75ba8000cf920ef9316c21120ae"} Dec 01 10:49:45 crc kubenswrapper[4761]: I1201 10:49:45.982941 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:45 crc kubenswrapper[4761]: I1201 10:49:45.983378 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.026256 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e1ab633-481e-4dfa-8e73-a179d903bcbd-operator-scripts\") pod \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\" (UID: \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\") " Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.026531 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rwhh\" (UniqueName: \"kubernetes.io/projected/1e1ab633-481e-4dfa-8e73-a179d903bcbd-kube-api-access-7rwhh\") pod \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\" (UID: \"1e1ab633-481e-4dfa-8e73-a179d903bcbd\") " Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.026705 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9xf4\" (UniqueName: \"kubernetes.io/projected/97cfb610-4283-487f-b8be-561479406c65-kube-api-access-t9xf4\") pod \"97cfb610-4283-487f-b8be-561479406c65\" (UID: \"97cfb610-4283-487f-b8be-561479406c65\") " Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.026827 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97cfb610-4283-487f-b8be-561479406c65-operator-scripts\") pod \"97cfb610-4283-487f-b8be-561479406c65\" (UID: \"97cfb610-4283-487f-b8be-561479406c65\") " Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.028009 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e1ab633-481e-4dfa-8e73-a179d903bcbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e1ab633-481e-4dfa-8e73-a179d903bcbd" (UID: "1e1ab633-481e-4dfa-8e73-a179d903bcbd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.028035 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97cfb610-4283-487f-b8be-561479406c65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97cfb610-4283-487f-b8be-561479406c65" (UID: "97cfb610-4283-487f-b8be-561479406c65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.032525 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1ab633-481e-4dfa-8e73-a179d903bcbd-kube-api-access-7rwhh" (OuterVolumeSpecName: "kube-api-access-7rwhh") pod "1e1ab633-481e-4dfa-8e73-a179d903bcbd" (UID: "1e1ab633-481e-4dfa-8e73-a179d903bcbd"). InnerVolumeSpecName "kube-api-access-7rwhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.033093 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97cfb610-4283-487f-b8be-561479406c65-kube-api-access-t9xf4" (OuterVolumeSpecName: "kube-api-access-t9xf4") pod "97cfb610-4283-487f-b8be-561479406c65" (UID: "97cfb610-4283-487f-b8be-561479406c65"). InnerVolumeSpecName "kube-api-access-t9xf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.129206 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e1ab633-481e-4dfa-8e73-a179d903bcbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.129256 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rwhh\" (UniqueName: \"kubernetes.io/projected/1e1ab633-481e-4dfa-8e73-a179d903bcbd-kube-api-access-7rwhh\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.129267 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9xf4\" (UniqueName: \"kubernetes.io/projected/97cfb610-4283-487f-b8be-561479406c65-kube-api-access-t9xf4\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.129276 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97cfb610-4283-487f-b8be-561479406c65-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.583663 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-7bztp" event={"ID":"1e1ab633-481e-4dfa-8e73-a179d903bcbd","Type":"ContainerDied","Data":"0e4d1cbaffd6861cf13b69c924c351ba7232269a57ee75a9aa9c89dfacded8b0"} Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.584143 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e4d1cbaffd6861cf13b69c924c351ba7232269a57ee75a9aa9c89dfacded8b0" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.583705 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-7bztp" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.585633 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" event={"ID":"97cfb610-4283-487f-b8be-561479406c65","Type":"ContainerDied","Data":"b2626e4ad08697d8bb6faa5d462faa885214dc1af1539532d51d3d6ac7f37504"} Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.585677 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2626e4ad08697d8bb6faa5d462faa885214dc1af1539532d51d3d6ac7f37504" Dec 01 10:49:46 crc kubenswrapper[4761]: I1201 10:49:46.585685 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.965691 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-mbl47"] Dec 01 10:49:47 crc kubenswrapper[4761]: E1201 10:49:47.965950 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1ab633-481e-4dfa-8e73-a179d903bcbd" containerName="mariadb-database-create" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.965963 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1ab633-481e-4dfa-8e73-a179d903bcbd" containerName="mariadb-database-create" Dec 01 10:49:47 crc kubenswrapper[4761]: E1201 10:49:47.965971 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97cfb610-4283-487f-b8be-561479406c65" containerName="mariadb-account-create-update" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.965977 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cfb610-4283-487f-b8be-561479406c65" containerName="mariadb-account-create-update" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.966112 4761 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1e1ab633-481e-4dfa-8e73-a179d903bcbd" containerName="mariadb-database-create" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.966135 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="97cfb610-4283-487f-b8be-561479406c65" containerName="mariadb-account-create-update" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.966542 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.970589 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.970691 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-pktsh" Dec 01 10:49:47 crc kubenswrapper[4761]: I1201 10:49:47.976909 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mbl47"] Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.066433 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6fhf\" (UniqueName: \"kubernetes.io/projected/e150cb9a-20fa-441b-8dcf-5c20542f6af8-kube-api-access-v6fhf\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.066520 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-config-data\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.066612 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-db-sync-config-data\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.168142 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-db-sync-config-data\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.168250 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6fhf\" (UniqueName: \"kubernetes.io/projected/e150cb9a-20fa-441b-8dcf-5c20542f6af8-kube-api-access-v6fhf\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.168303 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-config-data\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.173835 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-db-sync-config-data\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.173923 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-config-data\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.194218 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6fhf\" (UniqueName: \"kubernetes.io/projected/e150cb9a-20fa-441b-8dcf-5c20542f6af8-kube-api-access-v6fhf\") pod \"glance-db-sync-mbl47\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:48 crc kubenswrapper[4761]: I1201 10:49:48.287568 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:49:53 crc kubenswrapper[4761]: I1201 10:49:53.638351 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"a0591941-4952-4c56-868f-e1bae8575651","Type":"ContainerStarted","Data":"bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839"} Dec 01 10:49:53 crc kubenswrapper[4761]: I1201 10:49:53.656521 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mbl47"] Dec 01 10:49:53 crc kubenswrapper[4761]: W1201 10:49:53.658187 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode150cb9a_20fa_441b_8dcf_5c20542f6af8.slice/crio-8ca8f024d95ea6910a0e32786237094564cc22b07e9c9543c7706d4d87030d2c WatchSource:0}: Error finding container 8ca8f024d95ea6910a0e32786237094564cc22b07e9c9543c7706d4d87030d2c: Status 404 returned error can't find the container with id 8ca8f024d95ea6910a0e32786237094564cc22b07e9c9543c7706d4d87030d2c Dec 01 10:49:53 crc kubenswrapper[4761]: I1201 10:49:53.667484 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.283994422 
podStartE2EDuration="11.667451428s" podCreationTimestamp="2025-12-01 10:49:42 +0000 UTC" firstStartedPulling="2025-12-01 10:49:43.92496722 +0000 UTC m=+1123.228725844" lastFinishedPulling="2025-12-01 10:49:53.308424226 +0000 UTC m=+1132.612182850" observedRunningTime="2025-12-01 10:49:53.66162821 +0000 UTC m=+1132.965386844" watchObservedRunningTime="2025-12-01 10:49:53.667451428 +0000 UTC m=+1132.971210092" Dec 01 10:49:54 crc kubenswrapper[4761]: I1201 10:49:54.647819 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mbl47" event={"ID":"e150cb9a-20fa-441b-8dcf-5c20542f6af8","Type":"ContainerStarted","Data":"8ca8f024d95ea6910a0e32786237094564cc22b07e9c9543c7706d4d87030d2c"} Dec 01 10:50:07 crc kubenswrapper[4761]: I1201 10:50:07.768451 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mbl47" event={"ID":"e150cb9a-20fa-441b-8dcf-5c20542f6af8","Type":"ContainerStarted","Data":"1015a574291915b39dda2e0a34a75e0604515f889835f11ad34dd0de24b79150"} Dec 01 10:50:07 crc kubenswrapper[4761]: I1201 10:50:07.788774 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-mbl47" podStartSLOduration=7.631785961 podStartE2EDuration="20.788754738s" podCreationTimestamp="2025-12-01 10:49:47 +0000 UTC" firstStartedPulling="2025-12-01 10:49:53.661114467 +0000 UTC m=+1132.964873091" lastFinishedPulling="2025-12-01 10:50:06.818083244 +0000 UTC m=+1146.121841868" observedRunningTime="2025-12-01 10:50:07.786531222 +0000 UTC m=+1147.090289866" watchObservedRunningTime="2025-12-01 10:50:07.788754738 +0000 UTC m=+1147.092513372" Dec 01 10:50:13 crc kubenswrapper[4761]: I1201 10:50:13.814644 4761 generic.go:334] "Generic (PLEG): container finished" podID="e150cb9a-20fa-441b-8dcf-5c20542f6af8" containerID="1015a574291915b39dda2e0a34a75e0604515f889835f11ad34dd0de24b79150" exitCode=0 Dec 01 10:50:13 crc kubenswrapper[4761]: I1201 10:50:13.814732 4761 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mbl47" event={"ID":"e150cb9a-20fa-441b-8dcf-5c20542f6af8","Type":"ContainerDied","Data":"1015a574291915b39dda2e0a34a75e0604515f889835f11ad34dd0de24b79150"} Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.275183 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.460286 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6fhf\" (UniqueName: \"kubernetes.io/projected/e150cb9a-20fa-441b-8dcf-5c20542f6af8-kube-api-access-v6fhf\") pod \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.460584 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-config-data\") pod \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.460638 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-db-sync-config-data\") pod \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\" (UID: \"e150cb9a-20fa-441b-8dcf-5c20542f6af8\") " Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.471665 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e150cb9a-20fa-441b-8dcf-5c20542f6af8" (UID: "e150cb9a-20fa-441b-8dcf-5c20542f6af8"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.471738 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e150cb9a-20fa-441b-8dcf-5c20542f6af8-kube-api-access-v6fhf" (OuterVolumeSpecName: "kube-api-access-v6fhf") pod "e150cb9a-20fa-441b-8dcf-5c20542f6af8" (UID: "e150cb9a-20fa-441b-8dcf-5c20542f6af8"). InnerVolumeSpecName "kube-api-access-v6fhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.522564 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-config-data" (OuterVolumeSpecName: "config-data") pod "e150cb9a-20fa-441b-8dcf-5c20542f6af8" (UID: "e150cb9a-20fa-441b-8dcf-5c20542f6af8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.563317 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6fhf\" (UniqueName: \"kubernetes.io/projected/e150cb9a-20fa-441b-8dcf-5c20542f6af8-kube-api-access-v6fhf\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.563374 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.563437 4761 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e150cb9a-20fa-441b-8dcf-5c20542f6af8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.835256 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mbl47" 
event={"ID":"e150cb9a-20fa-441b-8dcf-5c20542f6af8","Type":"ContainerDied","Data":"8ca8f024d95ea6910a0e32786237094564cc22b07e9c9543c7706d4d87030d2c"} Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.835345 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ca8f024d95ea6910a0e32786237094564cc22b07e9c9543c7706d4d87030d2c" Dec 01 10:50:15 crc kubenswrapper[4761]: I1201 10:50:15.835438 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mbl47" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.412889 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:16 crc kubenswrapper[4761]: E1201 10:50:16.414258 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e150cb9a-20fa-441b-8dcf-5c20542f6af8" containerName="glance-db-sync" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.414342 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e150cb9a-20fa-441b-8dcf-5c20542f6af8" containerName="glance-db-sync" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.414568 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e150cb9a-20fa-441b-8dcf-5c20542f6af8" containerName="glance-db-sync" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.415648 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.417579 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-pktsh" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.422804 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.426017 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.427076 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.430544 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.437275 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.443808 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578027 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-nvme\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578075 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-lib-modules\") pod \"glance-default-single-1\" 
(UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578107 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578126 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578144 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578160 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-logs\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578176 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-httpd-run\") pod \"glance-default-single-0\" (UID: 
\"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578195 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-run\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578212 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-logs\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578230 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-run\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578299 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578352 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-dev\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " 
pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578375 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-nvme\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578401 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-scripts\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578429 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-sys\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578463 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq62j\" (UniqueName: \"kubernetes.io/projected/90c138b4-b65e-45f3-9971-5c5c259c9c01-kube-api-access-bq62j\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578514 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx7t6\" (UniqueName: \"kubernetes.io/projected/79a39a04-4c15-4cf4-bf95-14a70f1e2397-kube-api-access-bx7t6\") pod \"glance-default-single-0\" (UID: 
\"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578563 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-scripts\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578587 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-httpd-run\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578614 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578630 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-dev\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578818 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-config-data\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " 
pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578848 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-config-data\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578866 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578879 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-sys\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.578983 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.579009 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 
01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.579027 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-lib-modules\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679774 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-logs\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679822 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-httpd-run\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679843 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-run\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679858 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-logs\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679878 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-run\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679895 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679911 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-dev\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679926 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-nvme\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679941 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-scripts\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679953 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-sys\") 
pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679972 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq62j\" (UniqueName: \"kubernetes.io/projected/90c138b4-b65e-45f3-9971-5c5c259c9c01-kube-api-access-bq62j\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.679998 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx7t6\" (UniqueName: \"kubernetes.io/projected/79a39a04-4c15-4cf4-bf95-14a70f1e2397-kube-api-access-bx7t6\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680019 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-scripts\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680038 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-httpd-run\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680059 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " 
pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680071 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-dev\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680091 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-config-data\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680110 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-config-data\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680129 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680142 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-sys\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680180 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680199 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680232 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-lib-modules\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680250 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-nvme\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680270 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-lib-modules\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680289 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680306 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680314 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-logs\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680513 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680322 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680782 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-httpd-run\") pod \"glance-default-single-0\" (UID: 
\"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680825 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-dev\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.680905 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681109 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-run\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681139 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681156 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-run\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 
10:50:16.681248 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681295 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681410 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-logs\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681455 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-lib-modules\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681530 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-lib-modules\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681598 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-nvme\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681601 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-nvme\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681666 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-dev\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681848 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681882 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-sys\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.682006 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: 
\"90c138b4-b65e-45f3-9971-5c5c259c9c01\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681940 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.681906 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-sys\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.682104 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-httpd-run\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.686925 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-config-data\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.687022 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-scripts\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc 
kubenswrapper[4761]: I1201 10:50:16.693113 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-scripts\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.693719 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-config-data\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.709130 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.711012 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq62j\" (UniqueName: \"kubernetes.io/projected/90c138b4-b65e-45f3-9971-5c5c259c9c01-kube-api-access-bq62j\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.717183 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx7t6\" (UniqueName: \"kubernetes.io/projected/79a39a04-4c15-4cf4-bf95-14a70f1e2397-kube-api-access-bx7t6\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.726089 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.738851 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.746747 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:16 crc kubenswrapper[4761]: I1201 10:50:16.754891 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.032889 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.196500 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:17 crc kubenswrapper[4761]: W1201 10:50:17.207195 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90c138b4_b65e_45f3_9971_5c5c259c9c01.slice/crio-857e756f3705f01908b6273427f471989afb51814c05eda5e0a341cd905f7ffa WatchSource:0}: Error finding container 857e756f3705f01908b6273427f471989afb51814c05eda5e0a341cd905f7ffa: Status 404 returned error can't find the container with id 857e756f3705f01908b6273427f471989afb51814c05eda5e0a341cd905f7ffa Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.334316 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.466052 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.860262 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"79a39a04-4c15-4cf4-bf95-14a70f1e2397","Type":"ContainerStarted","Data":"107d9d3ccb96d140adcbc770cfe6d1edf82f39d47280c5d8d76f9a51c996cd3e"} Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.860638 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"79a39a04-4c15-4cf4-bf95-14a70f1e2397","Type":"ContainerStarted","Data":"e2942040181efbd47ff8722838d2bafcb963d3be90c47e47049dc9a435552539"} Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.860653 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"79a39a04-4c15-4cf4-bf95-14a70f1e2397","Type":"ContainerStarted","Data":"173996b1af2840a991ee4e47a297bfcef67f28d4634af7c55b76e096e459c1a2"} Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.863818 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"90c138b4-b65e-45f3-9971-5c5c259c9c01","Type":"ContainerStarted","Data":"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb"} Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.864319 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"90c138b4-b65e-45f3-9971-5c5c259c9c01","Type":"ContainerStarted","Data":"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6"} Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.864415 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"90c138b4-b65e-45f3-9971-5c5c259c9c01","Type":"ContainerStarted","Data":"857e756f3705f01908b6273427f471989afb51814c05eda5e0a341cd905f7ffa"} Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.864107 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerName="glance-log" containerID="cri-o://6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6" gracePeriod=30 Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.864181 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerName="glance-httpd" containerID="cri-o://e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb" gracePeriod=30 Dec 01 10:50:17 crc kubenswrapper[4761]: I1201 10:50:17.913230 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.913211117 podStartE2EDuration="1.913211117s" podCreationTimestamp="2025-12-01 10:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:17.908893368 +0000 UTC m=+1157.212651992" watchObservedRunningTime="2025-12-01 10:50:17.913211117 +0000 UTC m=+1157.216969741" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.293593 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.417644 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-scripts\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.417752 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-sys\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.417810 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.417872 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-iscsi\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc 
kubenswrapper[4761]: I1201 10:50:18.417965 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-nvme\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418029 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-lib-modules\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418153 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-config-data\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418240 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-logs\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418299 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq62j\" (UniqueName: \"kubernetes.io/projected/90c138b4-b65e-45f3-9971-5c5c259c9c01-kube-api-access-bq62j\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418355 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-var-locks-brick\") pod 
\"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418398 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418461 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-httpd-run\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418607 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-run\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.418675 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-dev\") pod \"90c138b4-b65e-45f3-9971-5c5c259c9c01\" (UID: \"90c138b4-b65e-45f3-9971-5c5c259c9c01\") " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.419377 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-dev" (OuterVolumeSpecName: "dev") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.420839 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-run" (OuterVolumeSpecName: "run") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.421170 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-logs" (OuterVolumeSpecName: "logs") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.421217 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.421244 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-sys" (OuterVolumeSpecName: "sys") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.421298 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.421754 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.421917 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.421981 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.426036 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.426134 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c138b4-b65e-45f3-9971-5c5c259c9c01-kube-api-access-bq62j" (OuterVolumeSpecName: "kube-api-access-bq62j") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "kube-api-access-bq62j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.426398 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.426929 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-scripts" (OuterVolumeSpecName: "scripts") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.474391 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-config-data" (OuterVolumeSpecName: "config-data") pod "90c138b4-b65e-45f3-9971-5c5c259c9c01" (UID: "90c138b4-b65e-45f3-9971-5c5c259c9c01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.520931 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.520969 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.520988 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521015 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521031 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq62j\" (UniqueName: \"kubernetes.io/projected/90c138b4-b65e-45f3-9971-5c5c259c9c01-kube-api-access-bq62j\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521046 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521434 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521465 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90c138b4-b65e-45f3-9971-5c5c259c9c01-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521477 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521488 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521498 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c138b4-b65e-45f3-9971-5c5c259c9c01-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521508 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521525 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.521537 4761 reconciler_common.go:293] "Volume detached for volume 
\"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90c138b4-b65e-45f3-9971-5c5c259c9c01-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.539398 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.545032 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.630159 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.630191 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.875067 4761 generic.go:334] "Generic (PLEG): container finished" podID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerID="e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb" exitCode=143 Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.875121 4761 generic.go:334] "Generic (PLEG): container finished" podID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerID="6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6" exitCode=143 Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.875530 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"90c138b4-b65e-45f3-9971-5c5c259c9c01","Type":"ContainerDied","Data":"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb"} Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.875749 4761 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"90c138b4-b65e-45f3-9971-5c5c259c9c01","Type":"ContainerDied","Data":"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6"} Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.875858 4761 scope.go:117] "RemoveContainer" containerID="e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.876067 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"90c138b4-b65e-45f3-9971-5c5c259c9c01","Type":"ContainerDied","Data":"857e756f3705f01908b6273427f471989afb51814c05eda5e0a341cd905f7ffa"} Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.876353 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.942344 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.951197 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.954616 4761 scope.go:117] "RemoveContainer" containerID="6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.966871 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:18 crc kubenswrapper[4761]: E1201 10:50:18.967211 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerName="glance-httpd" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.967230 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" 
containerName="glance-httpd" Dec 01 10:50:18 crc kubenswrapper[4761]: E1201 10:50:18.967242 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerName="glance-log" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.967249 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerName="glance-log" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.967416 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerName="glance-log" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.967449 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" containerName="glance-httpd" Dec 01 10:50:18 crc kubenswrapper[4761]: I1201 10:50:18.968307 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.023507 4761 scope.go:117] "RemoveContainer" containerID="e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb" Dec 01 10:50:19 crc kubenswrapper[4761]: E1201 10:50:19.024477 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb\": container with ID starting with e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb not found: ID does not exist" containerID="e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.024511 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb"} err="failed to get container status \"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb\": rpc error: code 
= NotFound desc = could not find container \"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb\": container with ID starting with e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb not found: ID does not exist" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.024530 4761 scope.go:117] "RemoveContainer" containerID="6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6" Dec 01 10:50:19 crc kubenswrapper[4761]: E1201 10:50:19.024928 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6\": container with ID starting with 6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6 not found: ID does not exist" containerID="6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.024972 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6"} err="failed to get container status \"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6\": rpc error: code = NotFound desc = could not find container \"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6\": container with ID starting with 6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6 not found: ID does not exist" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.024996 4761 scope.go:117] "RemoveContainer" containerID="e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.025262 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb"} err="failed to get container status \"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb\": rpc 
error: code = NotFound desc = could not find container \"e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb\": container with ID starting with e783160f1d75bdf3afdabed19de733e5175db0cb4def63016d857413e97e17cb not found: ID does not exist" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.025279 4761 scope.go:117] "RemoveContainer" containerID="6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.025491 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6"} err="failed to get container status \"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6\": rpc error: code = NotFound desc = could not find container \"6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6\": container with ID starting with 6b05184c75facbe18fcd6b3dd4980e45c75edd106e48d158004af719536a6ce6 not found: ID does not exist" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.052573 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.136655 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c138b4-b65e-45f3-9971-5c5c259c9c01" path="/var/lib/kubelet/pods/90c138b4-b65e-45f3-9971-5c5c259c9c01/volumes" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137640 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137682 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137705 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-config-data\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137721 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-scripts\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137738 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-lib-modules\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137775 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-sys\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137853 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-run\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137895 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137913 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-logs\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137926 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-dev\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.137945 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.138096 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4gj\" (UniqueName: 
\"kubernetes.io/projected/e187f635-6975-4330-b4ff-c24d05486965-kube-api-access-hl4gj\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.138121 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-httpd-run\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.138135 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-nvme\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240042 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-scripts\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240116 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-lib-modules\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240176 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-sys\") 
pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240227 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-run\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240266 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240286 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-logs\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240309 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-dev\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240350 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc 
kubenswrapper[4761]: I1201 10:50:19.240397 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4gj\" (UniqueName: \"kubernetes.io/projected/e187f635-6975-4330-b4ff-c24d05486965-kube-api-access-hl4gj\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240422 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-httpd-run\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240444 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-nvme\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240471 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240510 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.240535 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-config-data\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241301 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-dev\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241497 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241533 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-lib-modules\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241574 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241596 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-run\") pod 
\"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241649 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-nvme\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241719 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-sys\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241911 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.241980 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.242983 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-httpd-run\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " 
pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.243020 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-logs\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.247087 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-scripts\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.252694 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-config-data\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.261866 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.267596 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4gj\" (UniqueName: \"kubernetes.io/projected/e187f635-6975-4330-b4ff-c24d05486965-kube-api-access-hl4gj\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.284606 4761 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:19 crc kubenswrapper[4761]: I1201 10:50:19.335264 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:20 crc kubenswrapper[4761]: W1201 10:50:20.002805 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode187f635_6975_4330_b4ff_c24d05486965.slice/crio-7f7488913a80a290f035eb70c4737fe590d7aa1570251241f942af8da490b7df WatchSource:0}: Error finding container 7f7488913a80a290f035eb70c4737fe590d7aa1570251241f942af8da490b7df: Status 404 returned error can't find the container with id 7f7488913a80a290f035eb70c4737fe590d7aa1570251241f942af8da490b7df Dec 01 10:50:20 crc kubenswrapper[4761]: I1201 10:50:20.004257 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:50:20 crc kubenswrapper[4761]: I1201 10:50:20.895412 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"e187f635-6975-4330-b4ff-c24d05486965","Type":"ContainerStarted","Data":"67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2"} Dec 01 10:50:20 crc kubenswrapper[4761]: I1201 10:50:20.896177 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"e187f635-6975-4330-b4ff-c24d05486965","Type":"ContainerStarted","Data":"7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050"} Dec 01 10:50:20 crc kubenswrapper[4761]: I1201 10:50:20.896206 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"e187f635-6975-4330-b4ff-c24d05486965","Type":"ContainerStarted","Data":"7f7488913a80a290f035eb70c4737fe590d7aa1570251241f942af8da490b7df"} Dec 01 10:50:20 crc kubenswrapper[4761]: I1201 10:50:20.935661 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.935544648 podStartE2EDuration="2.935544648s" podCreationTimestamp="2025-12-01 10:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:20.925532545 +0000 UTC m=+1160.229291209" watchObservedRunningTime="2025-12-01 10:50:20.935544648 +0000 UTC m=+1160.239303302" Dec 01 10:50:27 crc kubenswrapper[4761]: I1201 10:50:27.033751 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:27 crc kubenswrapper[4761]: I1201 10:50:27.034396 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:27 crc kubenswrapper[4761]: I1201 10:50:27.076959 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:27 crc kubenswrapper[4761]: I1201 10:50:27.093623 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:27 crc kubenswrapper[4761]: I1201 10:50:27.954867 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:27 crc kubenswrapper[4761]: I1201 10:50:27.955302 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:29 crc kubenswrapper[4761]: I1201 10:50:29.336238 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:29 crc kubenswrapper[4761]: I1201 10:50:29.337610 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:29 crc kubenswrapper[4761]: I1201 10:50:29.453300 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:29 crc kubenswrapper[4761]: I1201 10:50:29.479490 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:29 crc kubenswrapper[4761]: I1201 10:50:29.967231 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:29 crc kubenswrapper[4761]: I1201 10:50:29.967373 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:30 crc kubenswrapper[4761]: I1201 10:50:30.419345 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:30 crc kubenswrapper[4761]: I1201 10:50:30.419459 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:50:30 crc kubenswrapper[4761]: I1201 10:50:30.488135 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:32 crc kubenswrapper[4761]: I1201 10:50:32.022010 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:50:32 crc kubenswrapper[4761]: I1201 10:50:32.022434 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:50:32 crc kubenswrapper[4761]: I1201 10:50:32.136539 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" 
Dec 01 10:50:32 crc kubenswrapper[4761]: I1201 10:50:32.192303 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:32 crc kubenswrapper[4761]: I1201 10:50:32.192665 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerName="glance-log" containerID="cri-o://e2942040181efbd47ff8722838d2bafcb963d3be90c47e47049dc9a435552539" gracePeriod=30 Dec 01 10:50:32 crc kubenswrapper[4761]: I1201 10:50:32.192754 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerName="glance-httpd" containerID="cri-o://107d9d3ccb96d140adcbc770cfe6d1edf82f39d47280c5d8d76f9a51c996cd3e" gracePeriod=30 Dec 01 10:50:33 crc kubenswrapper[4761]: I1201 10:50:32.999867 4761 generic.go:334] "Generic (PLEG): container finished" podID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerID="e2942040181efbd47ff8722838d2bafcb963d3be90c47e47049dc9a435552539" exitCode=143 Dec 01 10:50:33 crc kubenswrapper[4761]: I1201 10:50:32.999999 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"79a39a04-4c15-4cf4-bf95-14a70f1e2397","Type":"ContainerDied","Data":"e2942040181efbd47ff8722838d2bafcb963d3be90c47e47049dc9a435552539"} Dec 01 10:50:36 crc kubenswrapper[4761]: I1201 10:50:36.069322 4761 generic.go:334] "Generic (PLEG): container finished" podID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerID="107d9d3ccb96d140adcbc770cfe6d1edf82f39d47280c5d8d76f9a51c996cd3e" exitCode=0 Dec 01 10:50:36 crc kubenswrapper[4761]: I1201 10:50:36.069633 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"79a39a04-4c15-4cf4-bf95-14a70f1e2397","Type":"ContainerDied","Data":"107d9d3ccb96d140adcbc770cfe6d1edf82f39d47280c5d8d76f9a51c996cd3e"} Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.335455 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459173 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-var-locks-brick\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459256 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-scripts\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459282 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459317 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459341 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx7t6\" (UniqueName: \"kubernetes.io/projected/79a39a04-4c15-4cf4-bf95-14a70f1e2397-kube-api-access-bx7t6\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459369 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-iscsi\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459389 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-lib-modules\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459434 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-run\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459483 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-dev\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459505 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-httpd-run\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459602 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-logs\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459641 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-nvme\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459704 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-config-data\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459731 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-sys\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.459772 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\" (UID: \"79a39a04-4c15-4cf4-bf95-14a70f1e2397\") " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.460098 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.460349 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-dev" (OuterVolumeSpecName: "dev") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.460386 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.460410 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.460435 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-run" (OuterVolumeSpecName: "run") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.460456 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.460677 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.460907 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-logs" (OuterVolumeSpecName: "logs") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.461670 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-sys" (OuterVolumeSpecName: "sys") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.465148 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance-cache") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.465774 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-scripts" (OuterVolumeSpecName: "scripts") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.466647 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.474915 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a39a04-4c15-4cf4-bf95-14a70f1e2397-kube-api-access-bx7t6" (OuterVolumeSpecName: "kube-api-access-bx7t6") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "kube-api-access-bx7t6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.504223 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-config-data" (OuterVolumeSpecName: "config-data") pod "79a39a04-4c15-4cf4-bf95-14a70f1e2397" (UID: "79a39a04-4c15-4cf4-bf95-14a70f1e2397"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561676 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561718 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561743 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561756 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx7t6\" (UniqueName: \"kubernetes.io/projected/79a39a04-4c15-4cf4-bf95-14a70f1e2397-kube-api-access-bx7t6\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561771 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561786 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561798 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561809 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561820 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561829 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a39a04-4c15-4cf4-bf95-14a70f1e2397-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561840 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561851 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a39a04-4c15-4cf4-bf95-14a70f1e2397-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.561862 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79a39a04-4c15-4cf4-bf95-14a70f1e2397-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.583063 4761 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.586397 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.663226 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:37 crc kubenswrapper[4761]: I1201 10:50:37.663260 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.089337 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"79a39a04-4c15-4cf4-bf95-14a70f1e2397","Type":"ContainerDied","Data":"173996b1af2840a991ee4e47a297bfcef67f28d4634af7c55b76e096e459c1a2"} Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.089404 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.089444 4761 scope.go:117] "RemoveContainer" containerID="107d9d3ccb96d140adcbc770cfe6d1edf82f39d47280c5d8d76f9a51c996cd3e" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.142617 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.143662 4761 scope.go:117] "RemoveContainer" containerID="e2942040181efbd47ff8722838d2bafcb963d3be90c47e47049dc9a435552539" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.146124 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.157273 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:38 crc kubenswrapper[4761]: E1201 10:50:38.157626 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerName="glance-httpd" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.157644 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerName="glance-httpd" Dec 01 10:50:38 crc kubenswrapper[4761]: E1201 10:50:38.157663 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerName="glance-log" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.157669 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerName="glance-log" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.157796 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerName="glance-httpd" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.157827 4761 
memory_manager.go:354] "RemoveStaleState removing state" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" containerName="glance-log" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.158613 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.211405 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.272504 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.272555 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-config-data\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.272652 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-logs\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273180 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-nvme\") pod \"glance-default-single-0\" (UID: 
\"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273207 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273225 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-run\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273248 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bcls\" (UniqueName: \"kubernetes.io/projected/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-kube-api-access-8bcls\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273269 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273283 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-iscsi\") pod \"glance-default-single-0\" (UID: 
\"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273300 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-httpd-run\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273316 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-scripts\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273341 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-dev\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273366 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-sys\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.273439 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-lib-modules\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") 
" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.374800 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bcls\" (UniqueName: \"kubernetes.io/projected/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-kube-api-access-8bcls\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.374845 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.374861 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.374879 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-httpd-run\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.374897 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-scripts\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 
10:50:38.374923 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-dev\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.374949 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-sys\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.374988 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-lib-modules\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375077 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375105 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375121 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-config-data\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375059 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-lib-modules\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375043 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-dev\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375169 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375041 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375081 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-sys\") pod \"glance-default-single-0\" (UID: 
\"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375411 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-httpd-run\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375585 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-logs\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375642 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-nvme\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375679 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375731 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-run\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375796 
4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-nvme\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375893 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-run\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375918 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.375934 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-logs\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.378306 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-scripts\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.379734 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-config-data\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.393535 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.396227 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bcls\" (UniqueName: \"kubernetes.io/projected/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-kube-api-access-8bcls\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.397311 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.511395 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:38 crc kubenswrapper[4761]: I1201 10:50:38.996059 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:50:39 crc kubenswrapper[4761]: I1201 10:50:39.104137 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3","Type":"ContainerStarted","Data":"242c53ed07f62ea6acf52654671cde08251d5c17f2b0aead7bed4d8ec8de7574"} Dec 01 10:50:39 crc kubenswrapper[4761]: I1201 10:50:39.144747 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a39a04-4c15-4cf4-bf95-14a70f1e2397" path="/var/lib/kubelet/pods/79a39a04-4c15-4cf4-bf95-14a70f1e2397/volumes" Dec 01 10:50:40 crc kubenswrapper[4761]: I1201 10:50:40.111799 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3","Type":"ContainerStarted","Data":"f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4"} Dec 01 10:50:40 crc kubenswrapper[4761]: I1201 10:50:40.112270 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3","Type":"ContainerStarted","Data":"4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606"} Dec 01 10:50:40 crc kubenswrapper[4761]: I1201 10:50:40.140435 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.140415527 podStartE2EDuration="2.140415527s" podCreationTimestamp="2025-12-01 10:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:40.138499789 +0000 UTC m=+1179.442258413" watchObservedRunningTime="2025-12-01 10:50:40.140415527 +0000 
UTC m=+1179.444174151" Dec 01 10:50:48 crc kubenswrapper[4761]: I1201 10:50:48.511984 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:48 crc kubenswrapper[4761]: I1201 10:50:48.512453 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:48 crc kubenswrapper[4761]: I1201 10:50:48.556995 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:48 crc kubenswrapper[4761]: I1201 10:50:48.583452 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:49 crc kubenswrapper[4761]: I1201 10:50:49.242812 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:49 crc kubenswrapper[4761]: I1201 10:50:49.242852 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:51 crc kubenswrapper[4761]: I1201 10:50:51.259052 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:50:51 crc kubenswrapper[4761]: I1201 10:50:51.260142 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:50:51 crc kubenswrapper[4761]: I1201 10:50:51.428733 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:50:51 crc kubenswrapper[4761]: I1201 10:50:51.488526 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:03 crc kubenswrapper[4761]: I1201 10:51:03.850625 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:51:03 crc kubenswrapper[4761]: I1201 10:51:03.851075 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.449557 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mbl47"] Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.466892 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mbl47"] Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.491420 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance2b4d-account-delete-2xpb6"] Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.492232 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.503800 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance2b4d-account-delete-2xpb6"] Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.532449 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-operator-scripts\") pod \"glance2b4d-account-delete-2xpb6\" (UID: \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\") " pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.532587 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k56mx\" (UniqueName: \"kubernetes.io/projected/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-kube-api-access-k56mx\") pod \"glance2b4d-account-delete-2xpb6\" (UID: \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\") " pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.552712 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.553039 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-log" containerID="cri-o://4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606" gracePeriod=30 Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.553112 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-httpd" 
containerID="cri-o://f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4" gracePeriod=30 Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.570107 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.570608 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="e187f635-6975-4330-b4ff-c24d05486965" containerName="glance-log" containerID="cri-o://7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050" gracePeriod=30 Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.570698 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="e187f635-6975-4330-b4ff-c24d05486965" containerName="glance-httpd" containerID="cri-o://67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2" gracePeriod=30 Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.633606 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-operator-scripts\") pod \"glance2b4d-account-delete-2xpb6\" (UID: \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\") " pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.633679 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k56mx\" (UniqueName: \"kubernetes.io/projected/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-kube-api-access-k56mx\") pod \"glance2b4d-account-delete-2xpb6\" (UID: \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\") " pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.634762 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-operator-scripts\") pod \"glance2b4d-account-delete-2xpb6\" (UID: \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\") " pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.642250 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.642428 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="a0591941-4952-4c56-868f-e1bae8575651" containerName="openstackclient" containerID="cri-o://bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839" gracePeriod=30 Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.662797 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k56mx\" (UniqueName: \"kubernetes.io/projected/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-kube-api-access-k56mx\") pod \"glance2b4d-account-delete-2xpb6\" (UID: \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\") " pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.812878 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:07 crc kubenswrapper[4761]: E1201 10:51:07.833041 4761 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0591941_4952_4c56_868f_e1bae8575651.slice/crio-conmon-bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17e0b5c2_a745_47ad_b70c_f9e84f44e4e3.slice/crio-4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:51:07 crc kubenswrapper[4761]: I1201 10:51:07.973014 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.038470 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0591941-4952-4c56-868f-e1bae8575651-openstack-config-secret\") pod \"a0591941-4952-4c56-868f-e1bae8575651\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.038768 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-scripts\") pod \"a0591941-4952-4c56-868f-e1bae8575651\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.038795 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-config\") pod \"a0591941-4952-4c56-868f-e1bae8575651\" (UID: 
\"a0591941-4952-4c56-868f-e1bae8575651\") " Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.038824 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n62hc\" (UniqueName: \"kubernetes.io/projected/a0591941-4952-4c56-868f-e1bae8575651-kube-api-access-n62hc\") pod \"a0591941-4952-4c56-868f-e1bae8575651\" (UID: \"a0591941-4952-4c56-868f-e1bae8575651\") " Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.040410 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod "a0591941-4952-4c56-868f-e1bae8575651" (UID: "a0591941-4952-4c56-868f-e1bae8575651"). InnerVolumeSpecName "openstack-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.046202 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0591941-4952-4c56-868f-e1bae8575651-kube-api-access-n62hc" (OuterVolumeSpecName: "kube-api-access-n62hc") pod "a0591941-4952-4c56-868f-e1bae8575651" (UID: "a0591941-4952-4c56-868f-e1bae8575651"). InnerVolumeSpecName "kube-api-access-n62hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.060349 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a0591941-4952-4c56-868f-e1bae8575651" (UID: "a0591941-4952-4c56-868f-e1bae8575651"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.066803 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0591941-4952-4c56-868f-e1bae8575651-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a0591941-4952-4c56-868f-e1bae8575651" (UID: "a0591941-4952-4c56-868f-e1bae8575651"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.141382 4761 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0591941-4952-4c56-868f-e1bae8575651-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.141438 4761 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.141452 4761 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0591941-4952-4c56-868f-e1bae8575651-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.141465 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n62hc\" (UniqueName: \"kubernetes.io/projected/a0591941-4952-4c56-868f-e1bae8575651-kube-api-access-n62hc\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.266298 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance2b4d-account-delete-2xpb6"] Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.414271 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" 
event={"ID":"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8","Type":"ContainerStarted","Data":"a2881f05b73776cc11ef0129b011d345df6fb637b99effa97aa1965b31e9911b"} Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.420057 4761 generic.go:334] "Generic (PLEG): container finished" podID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerID="4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606" exitCode=143 Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.420123 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3","Type":"ContainerDied","Data":"4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606"} Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.422821 4761 generic.go:334] "Generic (PLEG): container finished" podID="a0591941-4952-4c56-868f-e1bae8575651" containerID="bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839" exitCode=143 Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.422870 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"a0591941-4952-4c56-868f-e1bae8575651","Type":"ContainerDied","Data":"bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839"} Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.422891 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"a0591941-4952-4c56-868f-e1bae8575651","Type":"ContainerDied","Data":"a8f1ada961510a91b3cbea5c83fd4b9ce6eea75ba8000cf920ef9316c21120ae"} Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.422908 4761 scope.go:117] "RemoveContainer" containerID="bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.423031 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.427885 4761 generic.go:334] "Generic (PLEG): container finished" podID="e187f635-6975-4330-b4ff-c24d05486965" containerID="7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050" exitCode=143 Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.427916 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"e187f635-6975-4330-b4ff-c24d05486965","Type":"ContainerDied","Data":"7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050"} Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.452771 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.458661 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.459900 4761 scope.go:117] "RemoveContainer" containerID="bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839" Dec 01 10:51:08 crc kubenswrapper[4761]: E1201 10:51:08.460288 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839\": container with ID starting with bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839 not found: ID does not exist" containerID="bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839" Dec 01 10:51:08 crc kubenswrapper[4761]: I1201 10:51:08.460338 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839"} err="failed to get container status \"bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839\": rpc error: code = NotFound desc = could not find container 
\"bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839\": container with ID starting with bdd76fdf6fd3180a05b0b8883554662a1b8ffebc80c38d16987de9fe13398839 not found: ID does not exist" Dec 01 10:51:09 crc kubenswrapper[4761]: I1201 10:51:09.142280 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0591941-4952-4c56-868f-e1bae8575651" path="/var/lib/kubelet/pods/a0591941-4952-4c56-868f-e1bae8575651/volumes" Dec 01 10:51:09 crc kubenswrapper[4761]: I1201 10:51:09.143789 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e150cb9a-20fa-441b-8dcf-5c20542f6af8" path="/var/lib/kubelet/pods/e150cb9a-20fa-441b-8dcf-5c20542f6af8/volumes" Dec 01 10:51:09 crc kubenswrapper[4761]: I1201 10:51:09.442780 4761 generic.go:334] "Generic (PLEG): container finished" podID="5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8" containerID="4fe4fb3bc22aa0fba9ed8bad95da774a5c2e52fc7fd5ef7098a06bc812285540" exitCode=0 Dec 01 10:51:09 crc kubenswrapper[4761]: I1201 10:51:09.442868 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" event={"ID":"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8","Type":"ContainerDied","Data":"4fe4fb3bc22aa0fba9ed8bad95da774a5c2e52fc7fd5ef7098a06bc812285540"} Dec 01 10:51:10 crc kubenswrapper[4761]: I1201 10:51:10.756974 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.104:9292/healthcheck\": read tcp 10.217.0.2:41114->10.217.0.104:9292: read: connection reset by peer" Dec 01 10:51:10 crc kubenswrapper[4761]: I1201 10:51:10.757043 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.104:9292/healthcheck\": read 
tcp 10.217.0.2:41126->10.217.0.104:9292: read: connection reset by peer" Dec 01 10:51:10 crc kubenswrapper[4761]: I1201 10:51:10.975384 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.134110 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.169903 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k56mx\" (UniqueName: \"kubernetes.io/projected/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-kube-api-access-k56mx\") pod \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\" (UID: \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.169971 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-operator-scripts\") pod \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\" (UID: \"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.171038 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8" (UID: "5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.179304 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-kube-api-access-k56mx" (OuterVolumeSpecName: "kube-api-access-k56mx") pod "5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8" (UID: "5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8"). InnerVolumeSpecName "kube-api-access-k56mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.207093 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271625 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271772 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-run\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271809 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-lib-modules\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271823 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-sys\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" 
(UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271851 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-logs\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271869 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271859 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-run" (OuterVolumeSpecName: "run") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271896 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-httpd-run\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271928 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271923 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-sys" (OuterVolumeSpecName: "sys") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271939 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-var-locks-brick\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272042 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-scripts\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.271981 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272067 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-nvme\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272084 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272106 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-dev\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272146 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl4gj\" (UniqueName: \"kubernetes.io/projected/e187f635-6975-4330-b4ff-c24d05486965-kube-api-access-hl4gj\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272163 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-iscsi\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272189 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-config-data\") pod \"e187f635-6975-4330-b4ff-c24d05486965\" (UID: \"e187f635-6975-4330-b4ff-c24d05486965\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272195 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-dev" (OuterVolumeSpecName: "dev") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272265 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272306 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272474 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-logs" (OuterVolumeSpecName: "logs") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272517 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k56mx\" (UniqueName: \"kubernetes.io/projected/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-kube-api-access-k56mx\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.272694 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.273204 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.273234 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.273305 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.273327 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.273339 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.273351 4761 reconciler_common.go:293] "Volume 
detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.273362 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.273374 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e187f635-6975-4330-b4ff-c24d05486965-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.274804 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.274901 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e187f635-6975-4330-b4ff-c24d05486965-kube-api-access-hl4gj" (OuterVolumeSpecName: "kube-api-access-hl4gj") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "kube-api-access-hl4gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.275308 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.281770 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-scripts" (OuterVolumeSpecName: "scripts") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.306023 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-config-data" (OuterVolumeSpecName: "config-data") pod "e187f635-6975-4330-b4ff-c24d05486965" (UID: "e187f635-6975-4330-b4ff-c24d05486965"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374225 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-run\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374288 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-sys\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374340 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-logs\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374362 4761 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-httpd-run\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374401 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-var-locks-brick\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374440 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-lib-modules\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374472 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374502 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-iscsi\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374538 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-dev\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 
10:51:11.374590 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374628 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-nvme\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374682 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bcls\" (UniqueName: \"kubernetes.io/projected/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-kube-api-access-8bcls\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374718 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-scripts\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374778 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-config-data\") pod \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\" (UID: \"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3\") " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.374848 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: 
"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.375185 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.375187 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.375212 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.375244 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-run" (OuterVolumeSpecName: "run") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.375611 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-sys" (OuterVolumeSpecName: "sys") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.375621 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-logs" (OuterVolumeSpecName: "logs") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.375655 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-dev" (OuterVolumeSpecName: "dev") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.375976 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376039 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl4gj\" (UniqueName: \"kubernetes.io/projected/e187f635-6975-4330-b4ff-c24d05486965-kube-api-access-hl4gj\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376056 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376066 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376075 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376095 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376104 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376113 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e187f635-6975-4330-b4ff-c24d05486965-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376125 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376133 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376141 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376149 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376159 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e187f635-6975-4330-b4ff-c24d05486965-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376170 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.376183 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.378122 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.378620 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-kube-api-access-8bcls" (OuterVolumeSpecName: "kube-api-access-8bcls") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "kube-api-access-8bcls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.380793 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-scripts" (OuterVolumeSpecName: "scripts") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.382019 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance-cache") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "local-storage16-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.394317 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.394537 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.409758 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-config-data" (OuterVolumeSpecName: "config-data") pod "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" (UID: "17e0b5c2-a745-47ad-b70c-f9e84f44e4e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.460271 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" event={"ID":"5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8","Type":"ContainerDied","Data":"a2881f05b73776cc11ef0129b011d345df6fb637b99effa97aa1965b31e9911b"} Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.460284 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance2b4d-account-delete-2xpb6" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.460371 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2881f05b73776cc11ef0129b011d345df6fb637b99effa97aa1965b31e9911b" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.461862 4761 generic.go:334] "Generic (PLEG): container finished" podID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerID="f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4" exitCode=0 Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.461912 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.461968 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3","Type":"ContainerDied","Data":"f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4"} Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.462134 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"17e0b5c2-a745-47ad-b70c-f9e84f44e4e3","Type":"ContainerDied","Data":"242c53ed07f62ea6acf52654671cde08251d5c17f2b0aead7bed4d8ec8de7574"} Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.462213 4761 scope.go:117] "RemoveContainer" containerID="f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.465884 4761 generic.go:334] "Generic (PLEG): container finished" podID="e187f635-6975-4330-b4ff-c24d05486965" containerID="67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2" exitCode=0 Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.465924 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.465947 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"e187f635-6975-4330-b4ff-c24d05486965","Type":"ContainerDied","Data":"67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2"} Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.466019 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"e187f635-6975-4330-b4ff-c24d05486965","Type":"ContainerDied","Data":"7f7488913a80a290f035eb70c4737fe590d7aa1570251241f942af8da490b7df"} Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.477954 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.478968 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.478996 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.479009 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bcls\" (UniqueName: \"kubernetes.io/projected/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-kube-api-access-8bcls\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.479021 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 
10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.479032 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.479044 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.479064 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.503403 4761 scope.go:117] "RemoveContainer" containerID="4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.503776 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.505118 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.507610 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.515699 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.524759 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.525908 4761 scope.go:117] "RemoveContainer" 
containerID="f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4" Dec 01 10:51:11 crc kubenswrapper[4761]: E1201 10:51:11.527449 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4\": container with ID starting with f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4 not found: ID does not exist" containerID="f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.527511 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4"} err="failed to get container status \"f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4\": rpc error: code = NotFound desc = could not find container \"f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4\": container with ID starting with f58ddb35ff6b1e9353b78fb690ecc3c01d88a644d18f1cfb74367f2661647cc4 not found: ID does not exist" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.527600 4761 scope.go:117] "RemoveContainer" containerID="4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606" Dec 01 10:51:11 crc kubenswrapper[4761]: E1201 10:51:11.527950 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606\": container with ID starting with 4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606 not found: ID does not exist" containerID="4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.527986 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606"} err="failed to get container status \"4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606\": rpc error: code = NotFound desc = could not find container \"4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606\": container with ID starting with 4408ea8a99957b1c63c25730fd5fca36a2bcea1a1c878e2dfe63e4c565a3f606 not found: ID does not exist" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.528025 4761 scope.go:117] "RemoveContainer" containerID="67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.533677 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.552047 4761 scope.go:117] "RemoveContainer" containerID="7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.567919 4761 scope.go:117] "RemoveContainer" containerID="67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2" Dec 01 10:51:11 crc kubenswrapper[4761]: E1201 10:51:11.568353 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2\": container with ID starting with 67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2 not found: ID does not exist" containerID="67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.568394 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2"} err="failed to get container status \"67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2\": rpc error: code = NotFound 
desc = could not find container \"67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2\": container with ID starting with 67a973a6d0c4b8b5df2082c6ee7487c2280a26b342ddba58a21810187d0892c2 not found: ID does not exist" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.568427 4761 scope.go:117] "RemoveContainer" containerID="7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050" Dec 01 10:51:11 crc kubenswrapper[4761]: E1201 10:51:11.568923 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050\": container with ID starting with 7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050 not found: ID does not exist" containerID="7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.568972 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050"} err="failed to get container status \"7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050\": rpc error: code = NotFound desc = could not find container \"7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050\": container with ID starting with 7b6665423eaed65c947edc55d436a305f72e61486fde9d681aa37ca0260e0050 not found: ID does not exist" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.580167 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4761]: I1201 10:51:11.580359 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:12 crc kubenswrapper[4761]: 
I1201 10:51:12.536864 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-7bztp"] Dec 01 10:51:12 crc kubenswrapper[4761]: I1201 10:51:12.553474 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-7bztp"] Dec 01 10:51:12 crc kubenswrapper[4761]: I1201 10:51:12.563590 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance2b4d-account-delete-2xpb6"] Dec 01 10:51:12 crc kubenswrapper[4761]: I1201 10:51:12.570910 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx"] Dec 01 10:51:12 crc kubenswrapper[4761]: I1201 10:51:12.577656 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance2b4d-account-delete-2xpb6"] Dec 01 10:51:12 crc kubenswrapper[4761]: I1201 10:51:12.584948 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-2b4d-account-create-update-x6kjx"] Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.144941 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" path="/var/lib/kubelet/pods/17e0b5c2-a745-47ad-b70c-f9e84f44e4e3/volumes" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.146844 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1ab633-481e-4dfa-8e73-a179d903bcbd" path="/var/lib/kubelet/pods/1e1ab633-481e-4dfa-8e73-a179d903bcbd/volumes" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.148090 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8" path="/var/lib/kubelet/pods/5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8/volumes" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.150516 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97cfb610-4283-487f-b8be-561479406c65" 
path="/var/lib/kubelet/pods/97cfb610-4283-487f-b8be-561479406c65/volumes" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.151791 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e187f635-6975-4330-b4ff-c24d05486965" path="/var/lib/kubelet/pods/e187f635-6975-4330-b4ff-c24d05486965/volumes" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.987495 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-rmlls"] Dec 01 10:51:13 crc kubenswrapper[4761]: E1201 10:51:13.989168 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e187f635-6975-4330-b4ff-c24d05486965" containerName="glance-log" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.989293 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e187f635-6975-4330-b4ff-c24d05486965" containerName="glance-log" Dec 01 10:51:13 crc kubenswrapper[4761]: E1201 10:51:13.989414 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e187f635-6975-4330-b4ff-c24d05486965" containerName="glance-httpd" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.989925 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e187f635-6975-4330-b4ff-c24d05486965" containerName="glance-httpd" Dec 01 10:51:13 crc kubenswrapper[4761]: E1201 10:51:13.990037 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-httpd" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.990119 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-httpd" Dec 01 10:51:13 crc kubenswrapper[4761]: E1201 10:51:13.990194 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-log" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.990264 4761 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-log" Dec 01 10:51:13 crc kubenswrapper[4761]: E1201 10:51:13.990332 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8" containerName="mariadb-account-delete" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.990398 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8" containerName="mariadb-account-delete" Dec 01 10:51:13 crc kubenswrapper[4761]: E1201 10:51:13.990483 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0591941-4952-4c56-868f-e1bae8575651" containerName="openstackclient" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.990570 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0591941-4952-4c56-868f-e1bae8575651" containerName="openstackclient" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.990819 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e187f635-6975-4330-b4ff-c24d05486965" containerName="glance-httpd" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.990915 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e187f635-6975-4330-b4ff-c24d05486965" containerName="glance-log" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.990999 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8cb1d9-3385-49c3-8db9-afbbb5c4e7d8" containerName="mariadb-account-delete" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.991075 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0591941-4952-4c56-868f-e1bae8575651" containerName="openstackclient" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.991149 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-log" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.991217 4761 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="17e0b5c2-a745-47ad-b70c-f9e84f44e4e3" containerName="glance-httpd" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.991853 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:13 crc kubenswrapper[4761]: I1201 10:51:13.997433 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-rmlls"] Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.007595 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-81d0-account-create-update-7kqgw"] Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.013292 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.019745 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.035618 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-81d0-account-create-update-7kqgw"] Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.122033 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcqx4\" (UniqueName: \"kubernetes.io/projected/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-kube-api-access-rcqx4\") pod \"glance-db-create-rmlls\" (UID: \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\") " pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.122336 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rns6p\" (UniqueName: \"kubernetes.io/projected/7e870ec2-88bf-4afb-abf8-358bb03a5910-kube-api-access-rns6p\") pod \"glance-81d0-account-create-update-7kqgw\" (UID: \"7e870ec2-88bf-4afb-abf8-358bb03a5910\") " 
pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.122455 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e870ec2-88bf-4afb-abf8-358bb03a5910-operator-scripts\") pod \"glance-81d0-account-create-update-7kqgw\" (UID: \"7e870ec2-88bf-4afb-abf8-358bb03a5910\") " pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.122585 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-operator-scripts\") pod \"glance-db-create-rmlls\" (UID: \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\") " pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.223906 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rns6p\" (UniqueName: \"kubernetes.io/projected/7e870ec2-88bf-4afb-abf8-358bb03a5910-kube-api-access-rns6p\") pod \"glance-81d0-account-create-update-7kqgw\" (UID: \"7e870ec2-88bf-4afb-abf8-358bb03a5910\") " pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.223992 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e870ec2-88bf-4afb-abf8-358bb03a5910-operator-scripts\") pod \"glance-81d0-account-create-update-7kqgw\" (UID: \"7e870ec2-88bf-4afb-abf8-358bb03a5910\") " pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.224077 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-operator-scripts\") pod \"glance-db-create-rmlls\" (UID: \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\") " pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.224129 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcqx4\" (UniqueName: \"kubernetes.io/projected/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-kube-api-access-rcqx4\") pod \"glance-db-create-rmlls\" (UID: \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\") " pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.225477 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e870ec2-88bf-4afb-abf8-358bb03a5910-operator-scripts\") pod \"glance-81d0-account-create-update-7kqgw\" (UID: \"7e870ec2-88bf-4afb-abf8-358bb03a5910\") " pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.225495 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-operator-scripts\") pod \"glance-db-create-rmlls\" (UID: \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\") " pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.243775 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcqx4\" (UniqueName: \"kubernetes.io/projected/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-kube-api-access-rcqx4\") pod \"glance-db-create-rmlls\" (UID: \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\") " pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.245917 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rns6p\" (UniqueName: 
\"kubernetes.io/projected/7e870ec2-88bf-4afb-abf8-358bb03a5910-kube-api-access-rns6p\") pod \"glance-81d0-account-create-update-7kqgw\" (UID: \"7e870ec2-88bf-4afb-abf8-358bb03a5910\") " pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.326229 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.348828 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.810519 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-81d0-account-create-update-7kqgw"] Dec 01 10:51:14 crc kubenswrapper[4761]: W1201 10:51:14.819758 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e870ec2_88bf_4afb_abf8_358bb03a5910.slice/crio-c2fbe1aa777c2306717d34b14f313be2d2d5b0051c19450e7cb4b5ba41b2f00c WatchSource:0}: Error finding container c2fbe1aa777c2306717d34b14f313be2d2d5b0051c19450e7cb4b5ba41b2f00c: Status 404 returned error can't find the container with id c2fbe1aa777c2306717d34b14f313be2d2d5b0051c19450e7cb4b5ba41b2f00c Dec 01 10:51:14 crc kubenswrapper[4761]: I1201 10:51:14.879838 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-rmlls"] Dec 01 10:51:15 crc kubenswrapper[4761]: I1201 10:51:15.517175 4761 generic.go:334] "Generic (PLEG): container finished" podID="7e870ec2-88bf-4afb-abf8-358bb03a5910" containerID="2f0c69f6996136859db1de8b25d96f5e0c6b7504d95c33d4b7549f81caed8103" exitCode=0 Dec 01 10:51:15 crc kubenswrapper[4761]: I1201 10:51:15.517277 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" 
event={"ID":"7e870ec2-88bf-4afb-abf8-358bb03a5910","Type":"ContainerDied","Data":"2f0c69f6996136859db1de8b25d96f5e0c6b7504d95c33d4b7549f81caed8103"} Dec 01 10:51:15 crc kubenswrapper[4761]: I1201 10:51:15.517856 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" event={"ID":"7e870ec2-88bf-4afb-abf8-358bb03a5910","Type":"ContainerStarted","Data":"c2fbe1aa777c2306717d34b14f313be2d2d5b0051c19450e7cb4b5ba41b2f00c"} Dec 01 10:51:15 crc kubenswrapper[4761]: I1201 10:51:15.520448 4761 generic.go:334] "Generic (PLEG): container finished" podID="b3e2fedd-9ddb-453f-8c5d-255a16963ce0" containerID="39f373ef322b45ef2ed9f8ed5ff93459091bfbbf73ee766cce03d076759a4a27" exitCode=0 Dec 01 10:51:15 crc kubenswrapper[4761]: I1201 10:51:15.520486 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-rmlls" event={"ID":"b3e2fedd-9ddb-453f-8c5d-255a16963ce0","Type":"ContainerDied","Data":"39f373ef322b45ef2ed9f8ed5ff93459091bfbbf73ee766cce03d076759a4a27"} Dec 01 10:51:15 crc kubenswrapper[4761]: I1201 10:51:15.520604 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-rmlls" event={"ID":"b3e2fedd-9ddb-453f-8c5d-255a16963ce0","Type":"ContainerStarted","Data":"9500fea2f24b1fe1d6c857919e95d0ba6314e1abbeb27411d73521d6683e35bb"} Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.841111 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.847674 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.964521 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-operator-scripts\") pod \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\" (UID: \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\") " Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.964621 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcqx4\" (UniqueName: \"kubernetes.io/projected/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-kube-api-access-rcqx4\") pod \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\" (UID: \"b3e2fedd-9ddb-453f-8c5d-255a16963ce0\") " Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.964686 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e870ec2-88bf-4afb-abf8-358bb03a5910-operator-scripts\") pod \"7e870ec2-88bf-4afb-abf8-358bb03a5910\" (UID: \"7e870ec2-88bf-4afb-abf8-358bb03a5910\") " Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.964750 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rns6p\" (UniqueName: \"kubernetes.io/projected/7e870ec2-88bf-4afb-abf8-358bb03a5910-kube-api-access-rns6p\") pod \"7e870ec2-88bf-4afb-abf8-358bb03a5910\" (UID: \"7e870ec2-88bf-4afb-abf8-358bb03a5910\") " Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.965906 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e870ec2-88bf-4afb-abf8-358bb03a5910-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e870ec2-88bf-4afb-abf8-358bb03a5910" (UID: "7e870ec2-88bf-4afb-abf8-358bb03a5910"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.965966 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b3e2fedd-9ddb-453f-8c5d-255a16963ce0" (UID: "b3e2fedd-9ddb-453f-8c5d-255a16963ce0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.973825 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-kube-api-access-rcqx4" (OuterVolumeSpecName: "kube-api-access-rcqx4") pod "b3e2fedd-9ddb-453f-8c5d-255a16963ce0" (UID: "b3e2fedd-9ddb-453f-8c5d-255a16963ce0"). InnerVolumeSpecName "kube-api-access-rcqx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:16 crc kubenswrapper[4761]: I1201 10:51:16.976441 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e870ec2-88bf-4afb-abf8-358bb03a5910-kube-api-access-rns6p" (OuterVolumeSpecName: "kube-api-access-rns6p") pod "7e870ec2-88bf-4afb-abf8-358bb03a5910" (UID: "7e870ec2-88bf-4afb-abf8-358bb03a5910"). InnerVolumeSpecName "kube-api-access-rns6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.066092 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rns6p\" (UniqueName: \"kubernetes.io/projected/7e870ec2-88bf-4afb-abf8-358bb03a5910-kube-api-access-rns6p\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.066129 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.066141 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcqx4\" (UniqueName: \"kubernetes.io/projected/b3e2fedd-9ddb-453f-8c5d-255a16963ce0-kube-api-access-rcqx4\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.066153 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e870ec2-88bf-4afb-abf8-358bb03a5910-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.537843 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.537840 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-81d0-account-create-update-7kqgw" event={"ID":"7e870ec2-88bf-4afb-abf8-358bb03a5910","Type":"ContainerDied","Data":"c2fbe1aa777c2306717d34b14f313be2d2d5b0051c19450e7cb4b5ba41b2f00c"} Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.538376 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2fbe1aa777c2306717d34b14f313be2d2d5b0051c19450e7cb4b5ba41b2f00c" Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.540724 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-rmlls" event={"ID":"b3e2fedd-9ddb-453f-8c5d-255a16963ce0","Type":"ContainerDied","Data":"9500fea2f24b1fe1d6c857919e95d0ba6314e1abbeb27411d73521d6683e35bb"} Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.540755 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9500fea2f24b1fe1d6c857919e95d0ba6314e1abbeb27411d73521d6683e35bb" Dec 01 10:51:17 crc kubenswrapper[4761]: I1201 10:51:17.540792 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-rmlls" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.154186 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-ngbs6"] Dec 01 10:51:19 crc kubenswrapper[4761]: E1201 10:51:19.154652 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e870ec2-88bf-4afb-abf8-358bb03a5910" containerName="mariadb-account-create-update" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.154674 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e870ec2-88bf-4afb-abf8-358bb03a5910" containerName="mariadb-account-create-update" Dec 01 10:51:19 crc kubenswrapper[4761]: E1201 10:51:19.154698 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e2fedd-9ddb-453f-8c5d-255a16963ce0" containerName="mariadb-database-create" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.154710 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e2fedd-9ddb-453f-8c5d-255a16963ce0" containerName="mariadb-database-create" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.154937 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e2fedd-9ddb-453f-8c5d-255a16963ce0" containerName="mariadb-database-create" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.154963 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e870ec2-88bf-4afb-abf8-358bb03a5910" containerName="mariadb-account-create-update" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.155744 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.160043 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-w84qq" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.160333 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.161403 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.172540 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ngbs6"] Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.199631 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-db-sync-config-data\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.199699 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-config-data\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.199781 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-combined-ca-bundle\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc 
kubenswrapper[4761]: I1201 10:51:19.199866 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rblnq\" (UniqueName: \"kubernetes.io/projected/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-kube-api-access-rblnq\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.300695 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-db-sync-config-data\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.300755 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-config-data\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.300797 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-combined-ca-bundle\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.300853 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rblnq\" (UniqueName: \"kubernetes.io/projected/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-kube-api-access-rblnq\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 
10:51:19.305534 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-db-sync-config-data\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.315094 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-config-data\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.315443 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-combined-ca-bundle\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.317879 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rblnq\" (UniqueName: \"kubernetes.io/projected/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-kube-api-access-rblnq\") pod \"glance-db-sync-ngbs6\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.483953 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:19 crc kubenswrapper[4761]: I1201 10:51:19.999423 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ngbs6"] Dec 01 10:51:20 crc kubenswrapper[4761]: I1201 10:51:20.571702 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ngbs6" event={"ID":"d1219fbb-fb2c-481d-9364-19b3a0b52e4a","Type":"ContainerStarted","Data":"dc5e8423924636c631e2c8e920ac2be6d56ab98e4efccc5f1cc3fa410967f98e"} Dec 01 10:51:20 crc kubenswrapper[4761]: I1201 10:51:20.572095 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ngbs6" event={"ID":"d1219fbb-fb2c-481d-9364-19b3a0b52e4a","Type":"ContainerStarted","Data":"a4603ad91b04e3d2432d5dbe31089cb386c8b7ef9db4b233cddda967333dd870"} Dec 01 10:51:20 crc kubenswrapper[4761]: I1201 10:51:20.596091 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-ngbs6" podStartSLOduration=1.596064747 podStartE2EDuration="1.596064747s" podCreationTimestamp="2025-12-01 10:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:20.593061372 +0000 UTC m=+1219.896819996" watchObservedRunningTime="2025-12-01 10:51:20.596064747 +0000 UTC m=+1219.899823411" Dec 01 10:51:23 crc kubenswrapper[4761]: I1201 10:51:23.616796 4761 generic.go:334] "Generic (PLEG): container finished" podID="d1219fbb-fb2c-481d-9364-19b3a0b52e4a" containerID="dc5e8423924636c631e2c8e920ac2be6d56ab98e4efccc5f1cc3fa410967f98e" exitCode=0 Dec 01 10:51:23 crc kubenswrapper[4761]: I1201 10:51:23.616960 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ngbs6" 
event={"ID":"d1219fbb-fb2c-481d-9364-19b3a0b52e4a","Type":"ContainerDied","Data":"dc5e8423924636c631e2c8e920ac2be6d56ab98e4efccc5f1cc3fa410967f98e"} Dec 01 10:51:24 crc kubenswrapper[4761]: I1201 10:51:24.910286 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.018639 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-db-sync-config-data\") pod \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.018721 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-combined-ca-bundle\") pod \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.018766 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-config-data\") pod \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.018810 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rblnq\" (UniqueName: \"kubernetes.io/projected/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-kube-api-access-rblnq\") pod \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\" (UID: \"d1219fbb-fb2c-481d-9364-19b3a0b52e4a\") " Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.024807 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-kube-api-access-rblnq" 
(OuterVolumeSpecName: "kube-api-access-rblnq") pod "d1219fbb-fb2c-481d-9364-19b3a0b52e4a" (UID: "d1219fbb-fb2c-481d-9364-19b3a0b52e4a"). InnerVolumeSpecName "kube-api-access-rblnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.026321 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d1219fbb-fb2c-481d-9364-19b3a0b52e4a" (UID: "d1219fbb-fb2c-481d-9364-19b3a0b52e4a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.039371 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1219fbb-fb2c-481d-9364-19b3a0b52e4a" (UID: "d1219fbb-fb2c-481d-9364-19b3a0b52e4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.060169 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-config-data" (OuterVolumeSpecName: "config-data") pod "d1219fbb-fb2c-481d-9364-19b3a0b52e4a" (UID: "d1219fbb-fb2c-481d-9364-19b3a0b52e4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.120284 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.120316 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rblnq\" (UniqueName: \"kubernetes.io/projected/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-kube-api-access-rblnq\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.120327 4761 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.120337 4761 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1219fbb-fb2c-481d-9364-19b3a0b52e4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.633139 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ngbs6" event={"ID":"d1219fbb-fb2c-481d-9364-19b3a0b52e4a","Type":"ContainerDied","Data":"a4603ad91b04e3d2432d5dbe31089cb386c8b7ef9db4b233cddda967333dd870"} Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.633184 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4603ad91b04e3d2432d5dbe31089cb386c8b7ef9db4b233cddda967333dd870" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.633242 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ngbs6" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.987506 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:25 crc kubenswrapper[4761]: E1201 10:51:25.987850 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1219fbb-fb2c-481d-9364-19b3a0b52e4a" containerName="glance-db-sync" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.987865 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1219fbb-fb2c-481d-9364-19b3a0b52e4a" containerName="glance-db-sync" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.988025 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1219fbb-fb2c-481d-9364-19b3a0b52e4a" containerName="glance-db-sync" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.989051 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.991679 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.991983 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.992896 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.993521 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-w84qq" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.993798 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Dec 01 10:51:25 crc kubenswrapper[4761]: I1201 10:51:25.996222 4761 reflector.go:368] 
Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.006948 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033423 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033464 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033483 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033518 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-httpd-run\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033535 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033591 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-logs\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033606 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-scripts\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033654 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5hdl\" (UniqueName: \"kubernetes.io/projected/f6da388a-8c26-450b-9217-ed89248ea76e-kube-api-access-s5hdl\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.033672 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-config-data\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.135581 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-logs\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.135922 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-scripts\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.135982 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5hdl\" (UniqueName: \"kubernetes.io/projected/f6da388a-8c26-450b-9217-ed89248ea76e-kube-api-access-s5hdl\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136005 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-config-data\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136030 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-logs\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136035 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136090 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136119 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136187 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-httpd-run\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136202 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136497 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod 
\"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.136780 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-httpd-run\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.140017 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-scripts\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.140271 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.140268 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.150902 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-public-tls-certs\") pod \"glance-default-single-0\" (UID: 
\"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.151540 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5hdl\" (UniqueName: \"kubernetes.io/projected/f6da388a-8c26-450b-9217-ed89248ea76e-kube-api-access-s5hdl\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.151694 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-config-data\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.155372 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.306019 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.746011 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:26 crc kubenswrapper[4761]: I1201 10:51:26.844174 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:27 crc kubenswrapper[4761]: I1201 10:51:27.662215 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f6da388a-8c26-450b-9217-ed89248ea76e","Type":"ContainerStarted","Data":"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2"} Dec 01 10:51:27 crc kubenswrapper[4761]: I1201 10:51:27.662832 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f6da388a-8c26-450b-9217-ed89248ea76e","Type":"ContainerStarted","Data":"23fc649455d9ee94cf1b179f6f9a62cf71e85b827246f4ce96618e42eb605749"} Dec 01 10:51:28 crc kubenswrapper[4761]: I1201 10:51:28.677456 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f6da388a-8c26-450b-9217-ed89248ea76e","Type":"ContainerStarted","Data":"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c"} Dec 01 10:51:28 crc kubenswrapper[4761]: I1201 10:51:28.677767 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" containerName="glance-log" containerID="cri-o://6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2" gracePeriod=30 Dec 01 10:51:28 crc kubenswrapper[4761]: I1201 10:51:28.677819 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" containerName="glance-httpd" 
containerID="cri-o://84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c" gracePeriod=30 Dec 01 10:51:28 crc kubenswrapper[4761]: I1201 10:51:28.721857 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.721823998 podStartE2EDuration="3.721823998s" podCreationTimestamp="2025-12-01 10:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:28.707759822 +0000 UTC m=+1228.011518456" watchObservedRunningTime="2025-12-01 10:51:28.721823998 +0000 UTC m=+1228.025582702" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.222395 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.283852 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-internal-tls-certs\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.283945 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5hdl\" (UniqueName: \"kubernetes.io/projected/f6da388a-8c26-450b-9217-ed89248ea76e-kube-api-access-s5hdl\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.283980 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-scripts\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.284011 
4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-httpd-run\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.284029 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-logs\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.284079 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-config-data\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.284128 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-public-tls-certs\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.284152 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-combined-ca-bundle\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.284178 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"f6da388a-8c26-450b-9217-ed89248ea76e\" (UID: \"f6da388a-8c26-450b-9217-ed89248ea76e\") " Dec 01 
10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.284741 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.284764 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-logs" (OuterVolumeSpecName: "logs") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.290201 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-scripts" (OuterVolumeSpecName: "scripts") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.291217 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.292344 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6da388a-8c26-450b-9217-ed89248ea76e-kube-api-access-s5hdl" (OuterVolumeSpecName: "kube-api-access-s5hdl") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "kube-api-access-s5hdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.308498 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.331500 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.340781 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.346786 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-config-data" (OuterVolumeSpecName: "config-data") pod "f6da388a-8c26-450b-9217-ed89248ea76e" (UID: "f6da388a-8c26-450b-9217-ed89248ea76e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.385865 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.385913 4761 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.385926 4761 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.385967 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.385980 4761 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.385993 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5hdl\" (UniqueName: 
\"kubernetes.io/projected/f6da388a-8c26-450b-9217-ed89248ea76e-kube-api-access-s5hdl\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.386005 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6da388a-8c26-450b-9217-ed89248ea76e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.386016 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.386028 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6da388a-8c26-450b-9217-ed89248ea76e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.404068 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.488088 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.694342 4761 generic.go:334] "Generic (PLEG): container finished" podID="f6da388a-8c26-450b-9217-ed89248ea76e" containerID="84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c" exitCode=0 Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.694388 4761 generic.go:334] "Generic (PLEG): container finished" podID="f6da388a-8c26-450b-9217-ed89248ea76e" containerID="6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2" exitCode=143 Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.694399 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.694443 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f6da388a-8c26-450b-9217-ed89248ea76e","Type":"ContainerDied","Data":"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c"} Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.694523 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f6da388a-8c26-450b-9217-ed89248ea76e","Type":"ContainerDied","Data":"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2"} Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.694572 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f6da388a-8c26-450b-9217-ed89248ea76e","Type":"ContainerDied","Data":"23fc649455d9ee94cf1b179f6f9a62cf71e85b827246f4ce96618e42eb605749"} Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.694599 4761 scope.go:117] "RemoveContainer" containerID="84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.740538 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.748443 4761 scope.go:117] "RemoveContainer" containerID="6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.755159 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.779769 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:29 crc kubenswrapper[4761]: E1201 10:51:29.780135 4761 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" containerName="glance-httpd" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.780149 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" containerName="glance-httpd" Dec 01 10:51:29 crc kubenswrapper[4761]: E1201 10:51:29.780164 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" containerName="glance-log" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.780173 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" containerName="glance-log" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.780356 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" containerName="glance-log" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.780369 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" containerName="glance-httpd" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.781237 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.808483 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.808712 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.808814 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.808851 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-w84qq" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.809061 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.809630 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.820473 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.821646 4761 scope.go:117] "RemoveContainer" containerID="84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c" Dec 01 10:51:29 crc kubenswrapper[4761]: E1201 10:51:29.822527 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c\": container with ID starting with 84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c not found: ID does not exist" containerID="84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c" Dec 01 10:51:29 crc 
kubenswrapper[4761]: I1201 10:51:29.822602 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c"} err="failed to get container status \"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c\": rpc error: code = NotFound desc = could not find container \"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c\": container with ID starting with 84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c not found: ID does not exist" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.822649 4761 scope.go:117] "RemoveContainer" containerID="6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2" Dec 01 10:51:29 crc kubenswrapper[4761]: E1201 10:51:29.827311 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2\": container with ID starting with 6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2 not found: ID does not exist" containerID="6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.827362 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2"} err="failed to get container status \"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2\": rpc error: code = NotFound desc = could not find container \"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2\": container with ID starting with 6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2 not found: ID does not exist" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.827392 4761 scope.go:117] "RemoveContainer" containerID="84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c" Dec 01 
10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.829009 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c"} err="failed to get container status \"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c\": rpc error: code = NotFound desc = could not find container \"84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c\": container with ID starting with 84475788633a9c2e1b6ec0ad9da47e141a0cb99585ce529e1edce48ae032a33c not found: ID does not exist" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.829049 4761 scope.go:117] "RemoveContainer" containerID="6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.829376 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2"} err="failed to get container status \"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2\": rpc error: code = NotFound desc = could not find container \"6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2\": container with ID starting with 6de0e97663ffdaf54d4b50114f33ae1386775e272c1b31396361456b7128f9e2 not found: ID does not exist" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894409 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-logs\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894459 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-scripts\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894511 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbqt9\" (UniqueName: \"kubernetes.io/projected/1e914af0-822d-4e5c-baaf-309656eac6b0-kube-api-access-wbqt9\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894533 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894579 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-config-data\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894724 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894767 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894806 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-httpd-run\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.894857 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996585 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbqt9\" (UniqueName: \"kubernetes.io/projected/1e914af0-822d-4e5c-baaf-309656eac6b0-kube-api-access-wbqt9\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996642 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996675 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-config-data\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996711 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996733 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996764 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-httpd-run\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996801 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996842 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-logs\") pod 
\"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.996875 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-scripts\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.997710 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-httpd-run\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.998019 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:29 crc kubenswrapper[4761]: I1201 10:51:29.998222 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-logs\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:30 crc kubenswrapper[4761]: I1201 10:51:30.000820 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-scripts\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 
01 10:51:30 crc kubenswrapper[4761]: I1201 10:51:30.001088 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:30 crc kubenswrapper[4761]: I1201 10:51:30.002259 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:30 crc kubenswrapper[4761]: I1201 10:51:30.002885 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-config-data\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:30 crc kubenswrapper[4761]: I1201 10:51:30.004072 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:30 crc kubenswrapper[4761]: I1201 10:51:30.018168 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:30 crc kubenswrapper[4761]: I1201 10:51:30.018849 4761 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-wbqt9\" (UniqueName: \"kubernetes.io/projected/1e914af0-822d-4e5c-baaf-309656eac6b0-kube-api-access-wbqt9\") pod \"glance-default-single-0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:30 crc kubenswrapper[4761]: I1201 10:51:30.171030 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:31 crc kubenswrapper[4761]: I1201 10:51:31.150370 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6da388a-8c26-450b-9217-ed89248ea76e" path="/var/lib/kubelet/pods/f6da388a-8c26-450b-9217-ed89248ea76e/volumes" Dec 01 10:51:31 crc kubenswrapper[4761]: I1201 10:51:31.590419 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:31 crc kubenswrapper[4761]: I1201 10:51:31.715993 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"1e914af0-822d-4e5c-baaf-309656eac6b0","Type":"ContainerStarted","Data":"748cffa62d7935e1e3760b5217c971adef19205312256c38fe201ce0788fd328"} Dec 01 10:51:32 crc kubenswrapper[4761]: I1201 10:51:32.725812 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"1e914af0-822d-4e5c-baaf-309656eac6b0","Type":"ContainerStarted","Data":"3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0"} Dec 01 10:51:33 crc kubenswrapper[4761]: I1201 10:51:33.739771 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"1e914af0-822d-4e5c-baaf-309656eac6b0","Type":"ContainerStarted","Data":"37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827"} Dec 01 10:51:33 crc kubenswrapper[4761]: I1201 10:51:33.791063 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=4.791025426 podStartE2EDuration="4.791025426s" podCreationTimestamp="2025-12-01 10:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:33.770495825 +0000 UTC m=+1233.074254489" watchObservedRunningTime="2025-12-01 10:51:33.791025426 +0000 UTC m=+1233.094784100" Dec 01 10:51:33 crc kubenswrapper[4761]: I1201 10:51:33.850500 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:51:33 crc kubenswrapper[4761]: I1201 10:51:33.850618 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:51:40 crc kubenswrapper[4761]: I1201 10:51:40.172091 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:40 crc kubenswrapper[4761]: I1201 10:51:40.174178 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:40 crc kubenswrapper[4761]: I1201 10:51:40.205212 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:40 crc kubenswrapper[4761]: I1201 10:51:40.245291 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:40 crc kubenswrapper[4761]: I1201 10:51:40.827069 
4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:40 crc kubenswrapper[4761]: I1201 10:51:40.827119 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:42 crc kubenswrapper[4761]: I1201 10:51:42.823423 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:42 crc kubenswrapper[4761]: I1201 10:51:42.835369 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.595796 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ngbs6"] Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.601377 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ngbs6"] Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.628062 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance81d0-account-delete-vg7ml"] Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.629126 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.638808 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance81d0-account-delete-vg7ml"] Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.683335 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.830580 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q64b\" (UniqueName: \"kubernetes.io/projected/37a95b49-1b97-479a-a330-7aef28ac59e4-kube-api-access-9q64b\") pod \"glance81d0-account-delete-vg7ml\" (UID: \"37a95b49-1b97-479a-a330-7aef28ac59e4\") " pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.830637 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a95b49-1b97-479a-a330-7aef28ac59e4-operator-scripts\") pod \"glance81d0-account-delete-vg7ml\" (UID: \"37a95b49-1b97-479a-a330-7aef28ac59e4\") " pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.860426 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerName="glance-log" containerID="cri-o://3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0" gracePeriod=30 Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.860496 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerName="glance-httpd" 
containerID="cri-o://37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827" gracePeriod=30 Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.866652 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.110:9292/healthcheck\": EOF" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.931656 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q64b\" (UniqueName: \"kubernetes.io/projected/37a95b49-1b97-479a-a330-7aef28ac59e4-kube-api-access-9q64b\") pod \"glance81d0-account-delete-vg7ml\" (UID: \"37a95b49-1b97-479a-a330-7aef28ac59e4\") " pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.931708 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a95b49-1b97-479a-a330-7aef28ac59e4-operator-scripts\") pod \"glance81d0-account-delete-vg7ml\" (UID: \"37a95b49-1b97-479a-a330-7aef28ac59e4\") " pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.932429 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a95b49-1b97-479a-a330-7aef28ac59e4-operator-scripts\") pod \"glance81d0-account-delete-vg7ml\" (UID: \"37a95b49-1b97-479a-a330-7aef28ac59e4\") " pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.958407 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q64b\" (UniqueName: \"kubernetes.io/projected/37a95b49-1b97-479a-a330-7aef28ac59e4-kube-api-access-9q64b\") pod \"glance81d0-account-delete-vg7ml\" (UID: 
\"37a95b49-1b97-479a-a330-7aef28ac59e4\") " pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:44 crc kubenswrapper[4761]: I1201 10:51:44.963602 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:45 crc kubenswrapper[4761]: I1201 10:51:45.142527 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1219fbb-fb2c-481d-9364-19b3a0b52e4a" path="/var/lib/kubelet/pods/d1219fbb-fb2c-481d-9364-19b3a0b52e4a/volumes" Dec 01 10:51:45 crc kubenswrapper[4761]: I1201 10:51:45.271294 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance81d0-account-delete-vg7ml"] Dec 01 10:51:45 crc kubenswrapper[4761]: W1201 10:51:45.283090 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37a95b49_1b97_479a_a330_7aef28ac59e4.slice/crio-7ba3eafba1a30e2b592e389f91480934bf23b623edc13fb88e424d5894554662 WatchSource:0}: Error finding container 7ba3eafba1a30e2b592e389f91480934bf23b623edc13fb88e424d5894554662: Status 404 returned error can't find the container with id 7ba3eafba1a30e2b592e389f91480934bf23b623edc13fb88e424d5894554662 Dec 01 10:51:45 crc kubenswrapper[4761]: I1201 10:51:45.869157 4761 generic.go:334] "Generic (PLEG): container finished" podID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerID="3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0" exitCode=143 Dec 01 10:51:45 crc kubenswrapper[4761]: I1201 10:51:45.869203 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"1e914af0-822d-4e5c-baaf-309656eac6b0","Type":"ContainerDied","Data":"3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0"} Dec 01 10:51:45 crc kubenswrapper[4761]: I1201 10:51:45.871221 4761 generic.go:334] "Generic (PLEG): container finished" 
podID="37a95b49-1b97-479a-a330-7aef28ac59e4" containerID="82014b7e289f2c16471309d7c778f0fea29c824bff1aeb3fdda39a5b7591f7c0" exitCode=0 Dec 01 10:51:45 crc kubenswrapper[4761]: I1201 10:51:45.871269 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" event={"ID":"37a95b49-1b97-479a-a330-7aef28ac59e4","Type":"ContainerDied","Data":"82014b7e289f2c16471309d7c778f0fea29c824bff1aeb3fdda39a5b7591f7c0"} Dec 01 10:51:45 crc kubenswrapper[4761]: I1201 10:51:45.871297 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" event={"ID":"37a95b49-1b97-479a-a330-7aef28ac59e4","Type":"ContainerStarted","Data":"7ba3eafba1a30e2b592e389f91480934bf23b623edc13fb88e424d5894554662"} Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.190109 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.371239 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q64b\" (UniqueName: \"kubernetes.io/projected/37a95b49-1b97-479a-a330-7aef28ac59e4-kube-api-access-9q64b\") pod \"37a95b49-1b97-479a-a330-7aef28ac59e4\" (UID: \"37a95b49-1b97-479a-a330-7aef28ac59e4\") " Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.371432 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a95b49-1b97-479a-a330-7aef28ac59e4-operator-scripts\") pod \"37a95b49-1b97-479a-a330-7aef28ac59e4\" (UID: \"37a95b49-1b97-479a-a330-7aef28ac59e4\") " Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.372363 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a95b49-1b97-479a-a330-7aef28ac59e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"37a95b49-1b97-479a-a330-7aef28ac59e4" (UID: "37a95b49-1b97-479a-a330-7aef28ac59e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.378717 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a95b49-1b97-479a-a330-7aef28ac59e4-kube-api-access-9q64b" (OuterVolumeSpecName: "kube-api-access-9q64b") pod "37a95b49-1b97-479a-a330-7aef28ac59e4" (UID: "37a95b49-1b97-479a-a330-7aef28ac59e4"). InnerVolumeSpecName "kube-api-access-9q64b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.473172 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a95b49-1b97-479a-a330-7aef28ac59e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.473216 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q64b\" (UniqueName: \"kubernetes.io/projected/37a95b49-1b97-479a-a330-7aef28ac59e4-kube-api-access-9q64b\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.894868 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" event={"ID":"37a95b49-1b97-479a-a330-7aef28ac59e4","Type":"ContainerDied","Data":"7ba3eafba1a30e2b592e389f91480934bf23b623edc13fb88e424d5894554662"} Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.894924 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ba3eafba1a30e2b592e389f91480934bf23b623edc13fb88e424d5894554662" Dec 01 10:51:47 crc kubenswrapper[4761]: I1201 10:51:47.894929 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance81d0-account-delete-vg7ml" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.361711 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.486729 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-internal-tls-certs\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.486787 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-httpd-run\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.486805 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-scripts\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.486865 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-logs\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.486892 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-combined-ca-bundle\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: 
\"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.486909 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-config-data\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.486946 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbqt9\" (UniqueName: \"kubernetes.io/projected/1e914af0-822d-4e5c-baaf-309656eac6b0-kube-api-access-wbqt9\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.486979 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-public-tls-certs\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.487012 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"1e914af0-822d-4e5c-baaf-309656eac6b0\" (UID: \"1e914af0-822d-4e5c-baaf-309656eac6b0\") " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.487294 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.488960 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-logs" (OuterVolumeSpecName: "logs") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.492346 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e914af0-822d-4e5c-baaf-309656eac6b0-kube-api-access-wbqt9" (OuterVolumeSpecName: "kube-api-access-wbqt9") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "kube-api-access-wbqt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.493719 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-scripts" (OuterVolumeSpecName: "scripts") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.494165 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.510282 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.524238 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-config-data" (OuterVolumeSpecName: "config-data") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.524611 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.551307 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1e914af0-822d-4e5c-baaf-309656eac6b0" (UID: "1e914af0-822d-4e5c-baaf-309656eac6b0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.588717 4761 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.589081 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.589097 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.589114 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e914af0-822d-4e5c-baaf-309656eac6b0-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.589127 4761 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.589140 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.589151 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbqt9\" (UniqueName: \"kubernetes.io/projected/1e914af0-822d-4e5c-baaf-309656eac6b0-kube-api-access-wbqt9\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.589164 4761 reconciler_common.go:293] "Volume 
detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e914af0-822d-4e5c-baaf-309656eac6b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.589202 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.601872 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.691021 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.910822 4761 generic.go:334] "Generic (PLEG): container finished" podID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerID="37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827" exitCode=0 Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.910903 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"1e914af0-822d-4e5c-baaf-309656eac6b0","Type":"ContainerDied","Data":"37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827"} Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.910951 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"1e914af0-822d-4e5c-baaf-309656eac6b0","Type":"ContainerDied","Data":"748cffa62d7935e1e3760b5217c971adef19205312256c38fe201ce0788fd328"} Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.910980 4761 scope.go:117] "RemoveContainer" containerID="37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827" Dec 01 10:51:48 crc 
kubenswrapper[4761]: I1201 10:51:48.911181 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.981022 4761 scope.go:117] "RemoveContainer" containerID="3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0" Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.985731 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:48 crc kubenswrapper[4761]: I1201 10:51:48.997886 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.001765 4761 scope.go:117] "RemoveContainer" containerID="37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827" Dec 01 10:51:49 crc kubenswrapper[4761]: E1201 10:51:49.002184 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827\": container with ID starting with 37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827 not found: ID does not exist" containerID="37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827" Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.002322 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827"} err="failed to get container status \"37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827\": rpc error: code = NotFound desc = could not find container \"37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827\": container with ID starting with 37ed50c9626e1bb892ebb86bd4c37fd12b6c01527a7184911653cc8d56ccf827 not found: ID does not exist" Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.002419 4761 
scope.go:117] "RemoveContainer" containerID="3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0" Dec 01 10:51:49 crc kubenswrapper[4761]: E1201 10:51:49.003033 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0\": container with ID starting with 3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0 not found: ID does not exist" containerID="3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0" Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.003062 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0"} err="failed to get container status \"3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0\": rpc error: code = NotFound desc = could not find container \"3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0\": container with ID starting with 3e44b2e79755d443f67b08e1a2cf81238eded45080e5c69155a796cf0960c0e0 not found: ID does not exist" Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.144523 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" path="/var/lib/kubelet/pods/1e914af0-822d-4e5c-baaf-309656eac6b0/volumes" Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.662860 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-rmlls"] Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.676269 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-rmlls"] Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.693155 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance81d0-account-delete-vg7ml"] Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.703723 4761 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-81d0-account-create-update-7kqgw"] Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.712170 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance81d0-account-delete-vg7ml"] Dec 01 10:51:49 crc kubenswrapper[4761]: I1201 10:51:49.718657 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-81d0-account-create-update-7kqgw"] Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.113185 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-cm2gp"] Dec 01 10:51:50 crc kubenswrapper[4761]: E1201 10:51:50.113679 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerName="glance-log" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.113711 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerName="glance-log" Dec 01 10:51:50 crc kubenswrapper[4761]: E1201 10:51:50.113735 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a95b49-1b97-479a-a330-7aef28ac59e4" containerName="mariadb-account-delete" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.113747 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a95b49-1b97-479a-a330-7aef28ac59e4" containerName="mariadb-account-delete" Dec 01 10:51:50 crc kubenswrapper[4761]: E1201 10:51:50.113764 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerName="glance-httpd" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.113773 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerName="glance-httpd" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.114014 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" 
containerName="glance-httpd" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.114038 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a95b49-1b97-479a-a330-7aef28ac59e4" containerName="mariadb-account-delete" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.114060 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e914af0-822d-4e5c-baaf-309656eac6b0" containerName="glance-log" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.114837 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-cm2gp" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.126148 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2clwc\" (UniqueName: \"kubernetes.io/projected/06997588-fb51-4b96-9f3a-22910179ca36-kube-api-access-2clwc\") pod \"glance-db-create-cm2gp\" (UID: \"06997588-fb51-4b96-9f3a-22910179ca36\") " pod="glance-kuttl-tests/glance-db-create-cm2gp" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.126246 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06997588-fb51-4b96-9f3a-22910179ca36-operator-scripts\") pod \"glance-db-create-cm2gp\" (UID: \"06997588-fb51-4b96-9f3a-22910179ca36\") " pod="glance-kuttl-tests/glance-db-create-cm2gp" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.126435 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-065c-account-create-update-kspzx"] Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.128955 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.133057 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.135008 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-cm2gp"] Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.143659 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-065c-account-create-update-kspzx"] Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.227617 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2clwc\" (UniqueName: \"kubernetes.io/projected/06997588-fb51-4b96-9f3a-22910179ca36-kube-api-access-2clwc\") pod \"glance-db-create-cm2gp\" (UID: \"06997588-fb51-4b96-9f3a-22910179ca36\") " pod="glance-kuttl-tests/glance-db-create-cm2gp" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.227713 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ddb4-e098-4eb8-9173-946c5b080b42-operator-scripts\") pod \"glance-065c-account-create-update-kspzx\" (UID: \"8724ddb4-e098-4eb8-9173-946c5b080b42\") " pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.227753 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06997588-fb51-4b96-9f3a-22910179ca36-operator-scripts\") pod \"glance-db-create-cm2gp\" (UID: \"06997588-fb51-4b96-9f3a-22910179ca36\") " pod="glance-kuttl-tests/glance-db-create-cm2gp" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.228433 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06997588-fb51-4b96-9f3a-22910179ca36-operator-scripts\") pod \"glance-db-create-cm2gp\" (UID: \"06997588-fb51-4b96-9f3a-22910179ca36\") " pod="glance-kuttl-tests/glance-db-create-cm2gp" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.228851 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvp46\" (UniqueName: \"kubernetes.io/projected/8724ddb4-e098-4eb8-9173-946c5b080b42-kube-api-access-fvp46\") pod \"glance-065c-account-create-update-kspzx\" (UID: \"8724ddb4-e098-4eb8-9173-946c5b080b42\") " pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.248618 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2clwc\" (UniqueName: \"kubernetes.io/projected/06997588-fb51-4b96-9f3a-22910179ca36-kube-api-access-2clwc\") pod \"glance-db-create-cm2gp\" (UID: \"06997588-fb51-4b96-9f3a-22910179ca36\") " pod="glance-kuttl-tests/glance-db-create-cm2gp" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.329526 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvp46\" (UniqueName: \"kubernetes.io/projected/8724ddb4-e098-4eb8-9173-946c5b080b42-kube-api-access-fvp46\") pod \"glance-065c-account-create-update-kspzx\" (UID: \"8724ddb4-e098-4eb8-9173-946c5b080b42\") " pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.329626 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ddb4-e098-4eb8-9173-946c5b080b42-operator-scripts\") pod \"glance-065c-account-create-update-kspzx\" (UID: \"8724ddb4-e098-4eb8-9173-946c5b080b42\") " pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 
10:51:50.332419 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ddb4-e098-4eb8-9173-946c5b080b42-operator-scripts\") pod \"glance-065c-account-create-update-kspzx\" (UID: \"8724ddb4-e098-4eb8-9173-946c5b080b42\") " pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.345572 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvp46\" (UniqueName: \"kubernetes.io/projected/8724ddb4-e098-4eb8-9173-946c5b080b42-kube-api-access-fvp46\") pod \"glance-065c-account-create-update-kspzx\" (UID: \"8724ddb4-e098-4eb8-9173-946c5b080b42\") " pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.437280 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-cm2gp" Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.449476 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx"
Dec 01 10:51:50 crc kubenswrapper[4761]: I1201 10:51:50.974111 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-cm2gp"]
Dec 01 10:51:50 crc kubenswrapper[4761]: W1201 10:51:50.982636 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06997588_fb51_4b96_9f3a_22910179ca36.slice/crio-fc3ffdae4c6ba71f2f443d04e597ed2fa8749d8a55bf4914b404b4caad98fdd9 WatchSource:0}: Error finding container fc3ffdae4c6ba71f2f443d04e597ed2fa8749d8a55bf4914b404b4caad98fdd9: Status 404 returned error can't find the container with id fc3ffdae4c6ba71f2f443d04e597ed2fa8749d8a55bf4914b404b4caad98fdd9
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.072148 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-065c-account-create-update-kspzx"]
Dec 01 10:51:51 crc kubenswrapper[4761]: W1201 10:51:51.086740 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8724ddb4_e098_4eb8_9173_946c5b080b42.slice/crio-0aa006d41bce24ac72635ab2ba9b3245965202b596d861eb0438a088a1b21515 WatchSource:0}: Error finding container 0aa006d41bce24ac72635ab2ba9b3245965202b596d861eb0438a088a1b21515: Status 404 returned error can't find the container with id 0aa006d41bce24ac72635ab2ba9b3245965202b596d861eb0438a088a1b21515
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.143602 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a95b49-1b97-479a-a330-7aef28ac59e4" path="/var/lib/kubelet/pods/37a95b49-1b97-479a-a330-7aef28ac59e4/volumes"
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.144428 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e870ec2-88bf-4afb-abf8-358bb03a5910" path="/var/lib/kubelet/pods/7e870ec2-88bf-4afb-abf8-358bb03a5910/volumes"
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.145213 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e2fedd-9ddb-453f-8c5d-255a16963ce0" path="/var/lib/kubelet/pods/b3e2fedd-9ddb-453f-8c5d-255a16963ce0/volumes"
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.953208 4761 generic.go:334] "Generic (PLEG): container finished" podID="8724ddb4-e098-4eb8-9173-946c5b080b42" containerID="071b7b461dd3bd9ce1a872230779f43f6f52506d4126e6f67ffb65fccb4cd80c" exitCode=0
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.953267 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" event={"ID":"8724ddb4-e098-4eb8-9173-946c5b080b42","Type":"ContainerDied","Data":"071b7b461dd3bd9ce1a872230779f43f6f52506d4126e6f67ffb65fccb4cd80c"}
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.953492 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" event={"ID":"8724ddb4-e098-4eb8-9173-946c5b080b42","Type":"ContainerStarted","Data":"0aa006d41bce24ac72635ab2ba9b3245965202b596d861eb0438a088a1b21515"}
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.956245 4761 generic.go:334] "Generic (PLEG): container finished" podID="06997588-fb51-4b96-9f3a-22910179ca36" containerID="f94c946c761272a768bb936be1dec51e638919f72fc9071f22de317a2831e55f" exitCode=0
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.956312 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-cm2gp" event={"ID":"06997588-fb51-4b96-9f3a-22910179ca36","Type":"ContainerDied","Data":"f94c946c761272a768bb936be1dec51e638919f72fc9071f22de317a2831e55f"}
Dec 01 10:51:51 crc kubenswrapper[4761]: I1201 10:51:51.956351 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-cm2gp" event={"ID":"06997588-fb51-4b96-9f3a-22910179ca36","Type":"ContainerStarted","Data":"fc3ffdae4c6ba71f2f443d04e597ed2fa8749d8a55bf4914b404b4caad98fdd9"}
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.382031 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx"
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.469858 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-cm2gp"
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.498663 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ddb4-e098-4eb8-9173-946c5b080b42-operator-scripts\") pod \"8724ddb4-e098-4eb8-9173-946c5b080b42\" (UID: \"8724ddb4-e098-4eb8-9173-946c5b080b42\") "
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.498792 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvp46\" (UniqueName: \"kubernetes.io/projected/8724ddb4-e098-4eb8-9173-946c5b080b42-kube-api-access-fvp46\") pod \"8724ddb4-e098-4eb8-9173-946c5b080b42\" (UID: \"8724ddb4-e098-4eb8-9173-946c5b080b42\") "
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.499522 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8724ddb4-e098-4eb8-9173-946c5b080b42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8724ddb4-e098-4eb8-9173-946c5b080b42" (UID: "8724ddb4-e098-4eb8-9173-946c5b080b42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.508256 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8724ddb4-e098-4eb8-9173-946c5b080b42-kube-api-access-fvp46" (OuterVolumeSpecName: "kube-api-access-fvp46") pod "8724ddb4-e098-4eb8-9173-946c5b080b42" (UID: "8724ddb4-e098-4eb8-9173-946c5b080b42"). InnerVolumeSpecName "kube-api-access-fvp46". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.599799 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2clwc\" (UniqueName: \"kubernetes.io/projected/06997588-fb51-4b96-9f3a-22910179ca36-kube-api-access-2clwc\") pod \"06997588-fb51-4b96-9f3a-22910179ca36\" (UID: \"06997588-fb51-4b96-9f3a-22910179ca36\") "
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.599976 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06997588-fb51-4b96-9f3a-22910179ca36-operator-scripts\") pod \"06997588-fb51-4b96-9f3a-22910179ca36\" (UID: \"06997588-fb51-4b96-9f3a-22910179ca36\") "
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.600432 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvp46\" (UniqueName: \"kubernetes.io/projected/8724ddb4-e098-4eb8-9173-946c5b080b42-kube-api-access-fvp46\") on node \"crc\" DevicePath \"\""
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.600462 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ddb4-e098-4eb8-9173-946c5b080b42-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.601058 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06997588-fb51-4b96-9f3a-22910179ca36-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06997588-fb51-4b96-9f3a-22910179ca36" (UID: "06997588-fb51-4b96-9f3a-22910179ca36"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.604092 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06997588-fb51-4b96-9f3a-22910179ca36-kube-api-access-2clwc" (OuterVolumeSpecName: "kube-api-access-2clwc") pod "06997588-fb51-4b96-9f3a-22910179ca36" (UID: "06997588-fb51-4b96-9f3a-22910179ca36"). InnerVolumeSpecName "kube-api-access-2clwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.702040 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2clwc\" (UniqueName: \"kubernetes.io/projected/06997588-fb51-4b96-9f3a-22910179ca36-kube-api-access-2clwc\") on node \"crc\" DevicePath \"\""
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.702105 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06997588-fb51-4b96-9f3a-22910179ca36-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.975754 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-cm2gp" event={"ID":"06997588-fb51-4b96-9f3a-22910179ca36","Type":"ContainerDied","Data":"fc3ffdae4c6ba71f2f443d04e597ed2fa8749d8a55bf4914b404b4caad98fdd9"}
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.975803 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc3ffdae4c6ba71f2f443d04e597ed2fa8749d8a55bf4914b404b4caad98fdd9"
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.975822 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-cm2gp"
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.978416 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx" event={"ID":"8724ddb4-e098-4eb8-9173-946c5b080b42","Type":"ContainerDied","Data":"0aa006d41bce24ac72635ab2ba9b3245965202b596d861eb0438a088a1b21515"}
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.978501 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa006d41bce24ac72635ab2ba9b3245965202b596d861eb0438a088a1b21515"
Dec 01 10:51:53 crc kubenswrapper[4761]: I1201 10:51:53.978606 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-065c-account-create-update-kspzx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.432388 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-27bkx"]
Dec 01 10:51:55 crc kubenswrapper[4761]: E1201 10:51:55.434182 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8724ddb4-e098-4eb8-9173-946c5b080b42" containerName="mariadb-account-create-update"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.434312 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="8724ddb4-e098-4eb8-9173-946c5b080b42" containerName="mariadb-account-create-update"
Dec 01 10:51:55 crc kubenswrapper[4761]: E1201 10:51:55.434424 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06997588-fb51-4b96-9f3a-22910179ca36" containerName="mariadb-database-create"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.434526 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="06997588-fb51-4b96-9f3a-22910179ca36" containerName="mariadb-database-create"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.434923 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="8724ddb4-e098-4eb8-9173-946c5b080b42" containerName="mariadb-account-create-update"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.435061 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="06997588-fb51-4b96-9f3a-22910179ca36" containerName="mariadb-database-create"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.435767 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.439173 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-h228n"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.439249 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.444722 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-27bkx"]
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.528007 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pzhg\" (UniqueName: \"kubernetes.io/projected/f53292ab-0210-41f8-a531-9ad4e8d0cffc-kube-api-access-9pzhg\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.528256 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-config-data\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.528481 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-db-sync-config-data\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.629760 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-config-data\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.629856 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-db-sync-config-data\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.629891 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pzhg\" (UniqueName: \"kubernetes.io/projected/f53292ab-0210-41f8-a531-9ad4e8d0cffc-kube-api-access-9pzhg\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.636245 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-config-data\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.639238 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-db-sync-config-data\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.666584 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pzhg\" (UniqueName: \"kubernetes.io/projected/f53292ab-0210-41f8-a531-9ad4e8d0cffc-kube-api-access-9pzhg\") pod \"glance-db-sync-27bkx\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") " pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:55 crc kubenswrapper[4761]: I1201 10:51:55.762941 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:51:56 crc kubenswrapper[4761]: I1201 10:51:56.264490 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-27bkx"]
Dec 01 10:51:57 crc kubenswrapper[4761]: I1201 10:51:57.006537 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-27bkx" event={"ID":"f53292ab-0210-41f8-a531-9ad4e8d0cffc","Type":"ContainerStarted","Data":"6813bcf8145b42d9a5fabc2f4130bb22ec0a387f8da58b71b59ad3e27e2fd514"}
Dec 01 10:51:57 crc kubenswrapper[4761]: I1201 10:51:57.006853 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-27bkx" event={"ID":"f53292ab-0210-41f8-a531-9ad4e8d0cffc","Type":"ContainerStarted","Data":"1a31fef44fee70272ac638736da5415d99f1751f6d95c436d44b0ef3119c69aa"}
Dec 01 10:51:57 crc kubenswrapper[4761]: I1201 10:51:57.028516 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-27bkx" podStartSLOduration=2.028498978 podStartE2EDuration="2.028498978s" podCreationTimestamp="2025-12-01 10:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:57.025234698 +0000 UTC m=+1256.328993322" watchObservedRunningTime="2025-12-01 10:51:57.028498978 +0000 UTC m=+1256.332257612"
Dec 01 10:52:00 crc kubenswrapper[4761]: I1201 10:52:00.030258 4761 generic.go:334] "Generic (PLEG): container finished" podID="f53292ab-0210-41f8-a531-9ad4e8d0cffc" containerID="6813bcf8145b42d9a5fabc2f4130bb22ec0a387f8da58b71b59ad3e27e2fd514" exitCode=0
Dec 01 10:52:00 crc kubenswrapper[4761]: I1201 10:52:00.030355 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-27bkx" event={"ID":"f53292ab-0210-41f8-a531-9ad4e8d0cffc","Type":"ContainerDied","Data":"6813bcf8145b42d9a5fabc2f4130bb22ec0a387f8da58b71b59ad3e27e2fd514"}
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.352917 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.435172 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pzhg\" (UniqueName: \"kubernetes.io/projected/f53292ab-0210-41f8-a531-9ad4e8d0cffc-kube-api-access-9pzhg\") pod \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") "
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.435230 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-db-sync-config-data\") pod \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") "
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.435293 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-config-data\") pod \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\" (UID: \"f53292ab-0210-41f8-a531-9ad4e8d0cffc\") "
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.440927 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53292ab-0210-41f8-a531-9ad4e8d0cffc-kube-api-access-9pzhg" (OuterVolumeSpecName: "kube-api-access-9pzhg") pod "f53292ab-0210-41f8-a531-9ad4e8d0cffc" (UID: "f53292ab-0210-41f8-a531-9ad4e8d0cffc"). InnerVolumeSpecName "kube-api-access-9pzhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.446294 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f53292ab-0210-41f8-a531-9ad4e8d0cffc" (UID: "f53292ab-0210-41f8-a531-9ad4e8d0cffc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.484349 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-config-data" (OuterVolumeSpecName: "config-data") pod "f53292ab-0210-41f8-a531-9ad4e8d0cffc" (UID: "f53292ab-0210-41f8-a531-9ad4e8d0cffc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.537283 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pzhg\" (UniqueName: \"kubernetes.io/projected/f53292ab-0210-41f8-a531-9ad4e8d0cffc-kube-api-access-9pzhg\") on node \"crc\" DevicePath \"\""
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.537350 4761 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:52:01 crc kubenswrapper[4761]: I1201 10:52:01.537361 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53292ab-0210-41f8-a531-9ad4e8d0cffc-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:52:02 crc kubenswrapper[4761]: I1201 10:52:02.049200 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-27bkx" event={"ID":"f53292ab-0210-41f8-a531-9ad4e8d0cffc","Type":"ContainerDied","Data":"1a31fef44fee70272ac638736da5415d99f1751f6d95c436d44b0ef3119c69aa"}
Dec 01 10:52:02 crc kubenswrapper[4761]: I1201 10:52:02.049295 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a31fef44fee70272ac638736da5415d99f1751f6d95c436d44b0ef3119c69aa"
Dec 01 10:52:02 crc kubenswrapper[4761]: I1201 10:52:02.049242 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-27bkx"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.310033 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Dec 01 10:52:03 crc kubenswrapper[4761]: E1201 10:52:03.310640 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53292ab-0210-41f8-a531-9ad4e8d0cffc" containerName="glance-db-sync"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.310656 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53292ab-0210-41f8-a531-9ad4e8d0cffc" containerName="glance-db-sync"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.310822 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53292ab-0210-41f8-a531-9ad4e8d0cffc" containerName="glance-db-sync"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.312096 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.314456 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-h228n"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.314459 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.314964 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.333370 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.361074 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462638 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462683 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462706 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462733 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-run\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462770 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462796 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-dev\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462816 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rgr\" (UniqueName: \"kubernetes.io/projected/d64d16c7-7040-44fa-a124-e8dde25108fd-kube-api-access-69rgr\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462852 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462907 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462923 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462951 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-logs\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.462965 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-sys\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.463017 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.463049 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.463312 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.482771 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.514539 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.516303 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.518352 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.532232 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564016 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564062 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564090 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-logs\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564108 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-sys\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564138 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564174 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564195 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564190 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564231 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564197 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-sys\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564213 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564297 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564300 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564317 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-run\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564338 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-run\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564354 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564376 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-dev\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564414 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69rgr\" (UniqueName: \"kubernetes.io/projected/d64d16c7-7040-44fa-a124-e8dde25108fd-kube-api-access-69rgr\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564435 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564444 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-dev\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564474 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-0"
Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.564852 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.565061 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-logs\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.569671 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.571765 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.588071 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69rgr\" (UniqueName: \"kubernetes.io/projected/d64d16c7-7040-44fa-a124-e8dde25108fd-kube-api-access-69rgr\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.593670 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: 
I1201 10:52:03.627854 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665325 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-dev\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665363 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665391 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-sys\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665408 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665443 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-run\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665463 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665481 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665503 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665518 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665541 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-9lwjm\" (UniqueName: \"kubernetes.io/projected/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-kube-api-access-9lwjm\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665579 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665610 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665627 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.665650 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767296 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767669 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-dev\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767687 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767710 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-sys\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767727 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767763 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-run\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767789 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767807 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767835 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767832 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-dev\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767855 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.767924 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-sys\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768028 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768039 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-run\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768059 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768212 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768358 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768469 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lwjm\" (UniqueName: \"kubernetes.io/projected/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-kube-api-access-9lwjm\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768507 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768537 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768597 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768664 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768701 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768719 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.768811 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.773485 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.775279 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.787435 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lwjm\" (UniqueName: \"kubernetes.io/projected/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-kube-api-access-9lwjm\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.794247 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.798208 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.850381 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.850456 4761 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.850514 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.863886 4761 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab11ccfedd2eeac2b7c4c9c4adbffd2e76c15b3f5230acf3c51b97fe7e1ab0cf"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.864004 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://ab11ccfedd2eeac2b7c4c9c4adbffd2e76c15b3f5230acf3c51b97fe7e1ab0cf" gracePeriod=600 Dec 01 10:52:03 crc kubenswrapper[4761]: I1201 10:52:03.864741 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:04 crc kubenswrapper[4761]: I1201 10:52:04.072943 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="ab11ccfedd2eeac2b7c4c9c4adbffd2e76c15b3f5230acf3c51b97fe7e1ab0cf" exitCode=0 Dec 01 10:52:04 crc kubenswrapper[4761]: I1201 10:52:04.073323 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"ab11ccfedd2eeac2b7c4c9c4adbffd2e76c15b3f5230acf3c51b97fe7e1ab0cf"} Dec 01 10:52:04 crc kubenswrapper[4761]: I1201 10:52:04.073382 4761 scope.go:117] "RemoveContainer" containerID="d30d5344481323b43a7d255c5c2b5f71119019ddc6b979360df65b87253e34d5" Dec 01 10:52:04 crc kubenswrapper[4761]: I1201 10:52:04.074605 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:52:04 crc kubenswrapper[4761]: I1201 10:52:04.267605 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:04 crc kubenswrapper[4761]: I1201 10:52:04.311782 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:04 crc kubenswrapper[4761]: W1201 10:52:04.318529 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d51a29b_12f9_4ba1_a67d_3dcff8a00cec.slice/crio-336b05d989d5e64ce68493aaad132f4baabba650f71327f9943e6e75427dfd3e WatchSource:0}: Error finding container 336b05d989d5e64ce68493aaad132f4baabba650f71327f9943e6e75427dfd3e: Status 404 returned error can't find the container with id 336b05d989d5e64ce68493aaad132f4baabba650f71327f9943e6e75427dfd3e Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.085264 4761 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d64d16c7-7040-44fa-a124-e8dde25108fd","Type":"ContainerStarted","Data":"a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.085820 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d64d16c7-7040-44fa-a124-e8dde25108fd","Type":"ContainerStarted","Data":"a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.085835 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d64d16c7-7040-44fa-a124-e8dde25108fd","Type":"ContainerStarted","Data":"fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.085844 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d64d16c7-7040-44fa-a124-e8dde25108fd","Type":"ContainerStarted","Data":"2e4445fe3fad073b2628b1effc84a4aa35ddfe00d4dade841ac9734c3e6b53b0"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.088157 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"7d57787b78893daee12ca3c7dcee8cb3520b06bd08aeb0d4d1cb8f9e5545ff08"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.091037 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec","Type":"ContainerStarted","Data":"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.091090 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec","Type":"ContainerStarted","Data":"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.091105 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec","Type":"ContainerStarted","Data":"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.091119 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec","Type":"ContainerStarted","Data":"336b05d989d5e64ce68493aaad132f4baabba650f71327f9943e6e75427dfd3e"} Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.091244 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-log" containerID="cri-o://b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d" gracePeriod=30 Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.091277 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-api" containerID="cri-o://102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832" gracePeriod=30 Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.091370 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-httpd" containerID="cri-o://59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964" gracePeriod=30 Dec 01 10:52:05 crc 
kubenswrapper[4761]: I1201 10:52:05.115122 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.11509923 podStartE2EDuration="2.11509923s" podCreationTimestamp="2025-12-01 10:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:05.114377741 +0000 UTC m=+1264.418136375" watchObservedRunningTime="2025-12-01 10:52:05.11509923 +0000 UTC m=+1264.418857864" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.149458 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.149436118 podStartE2EDuration="3.149436118s" podCreationTimestamp="2025-12-01 10:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:05.144430941 +0000 UTC m=+1264.448189565" watchObservedRunningTime="2025-12-01 10:52:05.149436118 +0000 UTC m=+1264.453194742" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.491671 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600428 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-config-data\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600498 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-var-locks-brick\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600526 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-logs\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600559 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600569 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600625 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-sys\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600656 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600687 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-httpd-run\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600720 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-run\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600734 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-iscsi\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600752 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-scripts\") pod 
\"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600768 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-lib-modules\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600782 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-nvme\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600799 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-dev\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600816 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lwjm\" (UniqueName: \"kubernetes.io/projected/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-kube-api-access-9lwjm\") pod \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\" (UID: \"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec\") " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.600993 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-logs" (OuterVolumeSpecName: "logs") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.601020 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-run" (OuterVolumeSpecName: "run") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.601092 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.601105 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.601114 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.601293 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.603312 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-sys" (OuterVolumeSpecName: "sys") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). 
InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.603386 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.603412 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.603438 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-dev" (OuterVolumeSpecName: "dev") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.603794 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.606875 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.607100 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.607833 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-scripts" (OuterVolumeSpecName: "scripts") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.611066 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-kube-api-access-9lwjm" (OuterVolumeSpecName: "kube-api-access-9lwjm") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "kube-api-access-9lwjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.675017 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-config-data" (OuterVolumeSpecName: "config-data") pod "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" (UID: "3d51a29b-12f9-4ba1-a67d-3dcff8a00cec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702004 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702033 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702041 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702049 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702058 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702066 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-dev\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702076 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lwjm\" (UniqueName: \"kubernetes.io/projected/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-kube-api-access-9lwjm\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702085 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702119 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702129 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.702141 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.717967 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.718535 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.804126 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:52:05 crc kubenswrapper[4761]: I1201 10:52:05.804153 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101124 4761 generic.go:334] "Generic (PLEG): container finished" podID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerID="102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832" exitCode=143 Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101372 4761 generic.go:334] "Generic (PLEG): container finished" podID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerID="59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964" exitCode=143 Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101381 4761 generic.go:334] "Generic (PLEG): container finished" podID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerID="b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d" exitCode=143 Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101188 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec","Type":"ContainerDied","Data":"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832"} Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101170 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101464 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec","Type":"ContainerDied","Data":"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964"} Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101476 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec","Type":"ContainerDied","Data":"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d"} Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101485 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3d51a29b-12f9-4ba1-a67d-3dcff8a00cec","Type":"ContainerDied","Data":"336b05d989d5e64ce68493aaad132f4baabba650f71327f9943e6e75427dfd3e"} Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.101500 4761 scope.go:117] "RemoveContainer" containerID="102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.129164 4761 scope.go:117] "RemoveContainer" containerID="59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.136003 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.142024 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.155944 4761 scope.go:117] "RemoveContainer" containerID="b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.160651 4761 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:06 crc kubenswrapper[4761]: E1201 10:52:06.160955 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-httpd" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.160972 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-httpd" Dec 01 10:52:06 crc kubenswrapper[4761]: E1201 10:52:06.160988 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-api" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.160995 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-api" Dec 01 10:52:06 crc kubenswrapper[4761]: E1201 10:52:06.161016 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-log" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.161022 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-log" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.161177 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-log" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.161194 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-httpd" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.161207 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" containerName="glance-api" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.162281 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.164146 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.175856 4761 scope.go:117] "RemoveContainer" containerID="102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832" Dec 01 10:52:06 crc kubenswrapper[4761]: E1201 10:52:06.176371 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832\": container with ID starting with 102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832 not found: ID does not exist" containerID="102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.176414 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832"} err="failed to get container status \"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832\": rpc error: code = NotFound desc = could not find container \"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832\": container with ID starting with 102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832 not found: ID does not exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.176439 4761 scope.go:117] "RemoveContainer" containerID="59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964" Dec 01 10:52:06 crc kubenswrapper[4761]: E1201 10:52:06.178891 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964\": container with ID starting with 
59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964 not found: ID does not exist" containerID="59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.178923 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964"} err="failed to get container status \"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964\": rpc error: code = NotFound desc = could not find container \"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964\": container with ID starting with 59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964 not found: ID does not exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.178952 4761 scope.go:117] "RemoveContainer" containerID="b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d" Dec 01 10:52:06 crc kubenswrapper[4761]: E1201 10:52:06.179929 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d\": container with ID starting with b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d not found: ID does not exist" containerID="b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.179968 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d"} err="failed to get container status \"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d\": rpc error: code = NotFound desc = could not find container \"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d\": container with ID starting with b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d not found: ID does not 
exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.179985 4761 scope.go:117] "RemoveContainer" containerID="102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.180312 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832"} err="failed to get container status \"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832\": rpc error: code = NotFound desc = could not find container \"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832\": container with ID starting with 102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832 not found: ID does not exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.180342 4761 scope.go:117] "RemoveContainer" containerID="59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.180595 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964"} err="failed to get container status \"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964\": rpc error: code = NotFound desc = could not find container \"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964\": container with ID starting with 59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964 not found: ID does not exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.180621 4761 scope.go:117] "RemoveContainer" containerID="b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.180859 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d"} err="failed to get container status 
\"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d\": rpc error: code = NotFound desc = could not find container \"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d\": container with ID starting with b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d not found: ID does not exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.180883 4761 scope.go:117] "RemoveContainer" containerID="102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.181126 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832"} err="failed to get container status \"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832\": rpc error: code = NotFound desc = could not find container \"102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832\": container with ID starting with 102db851e5ced1d273a4903055ce3004494b1bbfa3e08b09e281102668465832 not found: ID does not exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.181153 4761 scope.go:117] "RemoveContainer" containerID="59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.181325 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964"} err="failed to get container status \"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964\": rpc error: code = NotFound desc = could not find container \"59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964\": container with ID starting with 59ca9b6d624a5cb98e195a4194f572f7924ce34e87a146e60af20f7a43810964 not found: ID does not exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.181343 4761 scope.go:117] "RemoveContainer" 
containerID="b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.181550 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d"} err="failed to get container status \"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d\": rpc error: code = NotFound desc = could not find container \"b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d\": container with ID starting with b0bb0657807b8f6ea4da30283468df04aa10ea13e9d9e8921b21e253f7d24b5d not found: ID does not exist" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.190955 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.326898 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.326993 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327032 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327059 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327091 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327253 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327311 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76c6c\" (UniqueName: \"kubernetes.io/projected/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-kube-api-access-76c6c\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327504 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-dev\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327585 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327612 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327694 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-run\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327753 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327792 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.327823 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-sys\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.429027 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-run\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.429093 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.429123 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.429191 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.429191 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-run\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437415 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437479 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-sys\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437491 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-sys\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437628 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-lib-modules\") pod 
\"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437680 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437727 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437752 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437796 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437822 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.437938 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.438844 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.438884 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.439012 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76c6c\" (UniqueName: \"kubernetes.io/projected/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-kube-api-access-76c6c\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.439148 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-dev\") pod \"glance-default-internal-api-0\" (UID: 
\"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.439183 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.439208 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.439309 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.439341 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.439398 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-dev\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 
crc kubenswrapper[4761]: I1201 10:52:06.439541 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.445976 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.446666 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.463390 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.465121 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76c6c\" (UniqueName: \"kubernetes.io/projected/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-kube-api-access-76c6c\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.465800 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.480118 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:06 crc kubenswrapper[4761]: I1201 10:52:06.951428 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:06 crc kubenswrapper[4761]: W1201 10:52:06.962977 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4877a9b_bec0_4d49_9c5a_9890c75a3a6e.slice/crio-99189e156412f3b829310e7f180edeef00cc8b631b6a4c39429f635a504201d0 WatchSource:0}: Error finding container 99189e156412f3b829310e7f180edeef00cc8b631b6a4c39429f635a504201d0: Status 404 returned error can't find the container with id 99189e156412f3b829310e7f180edeef00cc8b631b6a4c39429f635a504201d0 Dec 01 10:52:07 crc kubenswrapper[4761]: I1201 10:52:07.115537 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e","Type":"ContainerStarted","Data":"99189e156412f3b829310e7f180edeef00cc8b631b6a4c39429f635a504201d0"} Dec 01 10:52:07 crc kubenswrapper[4761]: I1201 10:52:07.152929 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d51a29b-12f9-4ba1-a67d-3dcff8a00cec" path="/var/lib/kubelet/pods/3d51a29b-12f9-4ba1-a67d-3dcff8a00cec/volumes" Dec 01 10:52:07 crc kubenswrapper[4761]: I1201 10:52:07.480033 4761 scope.go:117] "RemoveContainer" containerID="f69657d67215d66ceae2163fd9ed2037605a7ee0c95be88575e662c63b990596" Dec 01 10:52:07 crc 
kubenswrapper[4761]: I1201 10:52:07.566845 4761 scope.go:117] "RemoveContainer" containerID="4b89e21038509d9f6c9aa07f5e7c1ac0af6fc3d2efc1a57641fbb258fd747fdc" Dec 01 10:52:07 crc kubenswrapper[4761]: I1201 10:52:07.604828 4761 scope.go:117] "RemoveContainer" containerID="cc7ab45718aaf09a3398d6dba5d78e63fe3d82a58c1f0816a3c02a7eecb61e99" Dec 01 10:52:08 crc kubenswrapper[4761]: I1201 10:52:08.124224 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e","Type":"ContainerStarted","Data":"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d"} Dec 01 10:52:08 crc kubenswrapper[4761]: I1201 10:52:08.124589 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e","Type":"ContainerStarted","Data":"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8"} Dec 01 10:52:08 crc kubenswrapper[4761]: I1201 10:52:08.124605 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e","Type":"ContainerStarted","Data":"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e"} Dec 01 10:52:08 crc kubenswrapper[4761]: I1201 10:52:08.168446 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.168423738 podStartE2EDuration="2.168423738s" podCreationTimestamp="2025-12-01 10:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:08.1608023 +0000 UTC m=+1267.464560924" watchObservedRunningTime="2025-12-01 10:52:08.168423738 +0000 UTC m=+1267.472182382" Dec 01 10:52:13 crc kubenswrapper[4761]: I1201 10:52:13.628155 4761 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:13 crc kubenswrapper[4761]: I1201 10:52:13.629113 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:13 crc kubenswrapper[4761]: I1201 10:52:13.629141 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:13 crc kubenswrapper[4761]: I1201 10:52:13.658600 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:13 crc kubenswrapper[4761]: I1201 10:52:13.660373 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:13 crc kubenswrapper[4761]: I1201 10:52:13.674672 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:14 crc kubenswrapper[4761]: I1201 10:52:14.171011 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:14 crc kubenswrapper[4761]: I1201 10:52:14.171060 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:14 crc kubenswrapper[4761]: I1201 10:52:14.171073 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:14 crc kubenswrapper[4761]: I1201 10:52:14.190159 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:14 crc kubenswrapper[4761]: I1201 10:52:14.198307 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" 
Dec 01 10:52:14 crc kubenswrapper[4761]: I1201 10:52:14.205051 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:16 crc kubenswrapper[4761]: I1201 10:52:16.481801 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:16 crc kubenswrapper[4761]: I1201 10:52:16.482174 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:16 crc kubenswrapper[4761]: I1201 10:52:16.482194 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:16 crc kubenswrapper[4761]: I1201 10:52:16.519309 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:16 crc kubenswrapper[4761]: I1201 10:52:16.521396 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:16 crc kubenswrapper[4761]: I1201 10:52:16.545330 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:17 crc kubenswrapper[4761]: I1201 10:52:17.203175 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:17 crc kubenswrapper[4761]: I1201 10:52:17.203271 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:17 crc kubenswrapper[4761]: I1201 10:52:17.203299 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:17 crc kubenswrapper[4761]: I1201 10:52:17.218670 4761 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:17 crc kubenswrapper[4761]: I1201 10:52:17.228718 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:17 crc kubenswrapper[4761]: I1201 10:52:17.230686 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.143607 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.145817 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.145958 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.147322 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.158134 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.213728 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261007 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-dev\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261054 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwv9l\" (UniqueName: \"kubernetes.io/projected/b4bb9102-620e-46bf-8e38-a1c74a28d07f-kube-api-access-lwv9l\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261084 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-sys\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261108 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261132 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261184 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261221 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261244 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-logs\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261267 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-sys\") pod \"glance-default-external-api-2\" (UID: 
\"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261288 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-logs\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261324 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-dev\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261348 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-config-data\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261375 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261399 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261500 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj69\" (UniqueName: \"kubernetes.io/projected/4f13a9fe-98e8-4536-8c58-5c53d37be913-kube-api-access-zkj69\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261617 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261699 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261734 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-scripts\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261770 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-scripts\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261814 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-run\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.261850 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.262081 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.262135 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-config-data\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.262175 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-run\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.262217 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.262312 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.262362 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.262433 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.303858 4761 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.305295 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.319967 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.321479 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.330247 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.330298 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366416 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366478 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366497 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-logs\") pod 
\"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366516 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-sys\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366529 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-logs\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366573 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-dev\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366593 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-config-data\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366613 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: 
\"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366630 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366650 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkj69\" (UniqueName: \"kubernetes.io/projected/4f13a9fe-98e8-4536-8c58-5c53d37be913-kube-api-access-zkj69\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366665 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366683 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366698 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-scripts\") pod \"glance-default-external-api-2\" (UID: 
\"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366714 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-scripts\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366729 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-run\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366743 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366762 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366778 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-config-data\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366797 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-run\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366814 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366810 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366836 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366859 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366877 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366900 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366929 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-dev\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366947 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwv9l\" (UniqueName: \"kubernetes.io/projected/b4bb9102-620e-46bf-8e38-a1c74a28d07f-kube-api-access-lwv9l\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366965 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-sys\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 
01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366980 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366998 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.367084 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.367197 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.367317 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 
10:52:19.367456 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.367491 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.367808 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-logs\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.367858 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-sys\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.368211 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-logs\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.368265 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-dev\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.368960 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.375637 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.366833 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.376062 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.376113 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-run\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.376155 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.376377 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-dev\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.376418 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.376448 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-run\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.377459 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-sys\") pod \"glance-default-external-api-1\" (UID: 
\"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.377540 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.377690 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.380515 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-scripts\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.382426 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-config-data\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.399177 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-scripts\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.403222 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-config-data\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.408262 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkj69\" (UniqueName: \"kubernetes.io/projected/4f13a9fe-98e8-4536-8c58-5c53d37be913-kube-api-access-zkj69\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.409254 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwv9l\" (UniqueName: \"kubernetes.io/projected/b4bb9102-620e-46bf-8e38-a1c74a28d07f-kube-api-access-lwv9l\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.421648 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.425486 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 
10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.429608 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-2\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.440859 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.468641 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.468709 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.468752 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc 
kubenswrapper[4761]: I1201 10:52:19.468779 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-config-data\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.468854 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-scripts\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.468918 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.468939 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-logs\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.468994 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" 
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.469012 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-dev\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.469253 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-sys\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.469343 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-run\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.469386 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.469669 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.469697 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hgnv\" (UniqueName: \"kubernetes.io/projected/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-kube-api-access-4hgnv\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.469952 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.493514 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.503113 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.505602 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.571811 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hgnv\" (UniqueName: \"kubernetes.io/projected/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-kube-api-access-4hgnv\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.571883 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.571915 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.571943 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.571974 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-sys\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572007 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572041 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-config-data\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572072 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572114 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-scripts\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572151 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572205 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572232 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-logs\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572266 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572302 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572333 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-scripts\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572367 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572400 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-config-data\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572434 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-dev\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572480 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8j7s\" (UniqueName: \"kubernetes.io/projected/712c7b85-7cb5-410a-9585-76642a5f47b4-kube-api-access-b8j7s\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572514 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-sys\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572567 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-run\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572595 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-dev\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572676 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-run\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572711 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572742 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572767 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-logs\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.572796 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.573432 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.573483 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.573686 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.574643 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-dev\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.574710 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-sys\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.574772 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.575257 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-run\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.575426 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-logs\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.575437 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.575694 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.579606 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-config-data\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.579850 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-scripts\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.596747 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hgnv\" (UniqueName: \"kubernetes.io/projected/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-kube-api-access-4hgnv\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.606295 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-2\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.674861 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-sys\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.674913 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.674946 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.674972 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675005 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675026 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675040 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-scripts\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675060 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-config-data\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675088 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8j7s\" (UniqueName: \"kubernetes.io/projected/712c7b85-7cb5-410a-9585-76642a5f47b4-kube-api-access-b8j7s\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675082 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675106 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-run\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675121 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-dev\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675160 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-dev\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675188 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675209 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.674979 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-sys\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675364 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-run\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675367 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675404 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675409 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675585 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-logs\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675613 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.675808 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.676229 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.677740 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-logs\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.677891 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.684131 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-scripts\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.684275 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-config-data\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.691817 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8j7s\" (UniqueName: \"kubernetes.io/projected/712c7b85-7cb5-410a-9585-76642a5f47b4-kube-api-access-b8j7s\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.703741 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.704016 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.799957 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Dec 01 10:52:19 crc kubenswrapper[4761]: W1201 10:52:19.905833 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb63d18a_acf5_4088_8616_5ccb8ab1d8f5.slice/crio-495d144267ebb59e329ad0dc4af47479208a928e79d04388bbe29cd006c769bc WatchSource:0}: Error finding container 495d144267ebb59e329ad0dc4af47479208a928e79d04388bbe29cd006c769bc: Status 404 returned error can't find the container with id 495d144267ebb59e329ad0dc4af47479208a928e79d04388bbe29cd006c769bc
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.913891 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"]
Dec 01 10:52:19 crc kubenswrapper[4761]: I1201 10:52:19.964776 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Dec 01 10:52:19 crc kubenswrapper[4761]: W1201 10:52:19.978726 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4bb9102_620e_46bf_8e38_a1c74a28d07f.slice/crio-0d0be37dcb1ca6daf85b39e5608e9d4149ce90b80c778f53e8c2e9cabc2e4e06 WatchSource:0}: Error finding container 0d0be37dcb1ca6daf85b39e5608e9d4149ce90b80c778f53e8c2e9cabc2e4e06: Status 404 returned error can't find the container with id 0d0be37dcb1ca6daf85b39e5608e9d4149ce90b80c778f53e8c2e9cabc2e4e06
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.016916 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"]
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.226335 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Dec 01 10:52:20 crc kubenswrapper[4761]: W1201 10:52:20.230964 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod712c7b85_7cb5_410a_9585_76642a5f47b4.slice/crio-b11d9e181d2d4e580d1d6365992635822c20c2a4c01b45c72530dca4d13bb2f4 WatchSource:0}: Error finding container b11d9e181d2d4e580d1d6365992635822c20c2a4c01b45c72530dca4d13bb2f4: Status 404 returned error can't find the container with id b11d9e181d2d4e580d1d6365992635822c20c2a4c01b45c72530dca4d13bb2f4
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.243346 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5","Type":"ContainerStarted","Data":"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010"}
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.243387 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5","Type":"ContainerStarted","Data":"495d144267ebb59e329ad0dc4af47479208a928e79d04388bbe29cd006c769bc"}
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.247347 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"712c7b85-7cb5-410a-9585-76642a5f47b4","Type":"ContainerStarted","Data":"b11d9e181d2d4e580d1d6365992635822c20c2a4c01b45c72530dca4d13bb2f4"}
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.248709 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"b4bb9102-620e-46bf-8e38-a1c74a28d07f","Type":"ContainerStarted","Data":"cd3bcf6b5bfce0e339d5be08c27f296ce63b3207a61ffe06152d8cebceaac132"}
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.248735 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"b4bb9102-620e-46bf-8e38-a1c74a28d07f","Type":"ContainerStarted","Data":"0d0be37dcb1ca6daf85b39e5608e9d4149ce90b80c778f53e8c2e9cabc2e4e06"}
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.250688 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"4f13a9fe-98e8-4536-8c58-5c53d37be913","Type":"ContainerStarted","Data":"54b54b9ac95ce8961922abf317ed0b36b5384d05df0019c5e7bbaa28df245766"}
Dec 01 10:52:20 crc kubenswrapper[4761]: I1201 10:52:20.250710 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"4f13a9fe-98e8-4536-8c58-5c53d37be913","Type":"ContainerStarted","Data":"51f1379759d0fa2461b7fb0890f06150fb0f1dce593333da04fd6a366dba19e0"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.260466 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5","Type":"ContainerStarted","Data":"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.261143 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5","Type":"ContainerStarted","Data":"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.264542 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"712c7b85-7cb5-410a-9585-76642a5f47b4","Type":"ContainerStarted","Data":"056f1ab4ff432d9d267ecee55cdacce22018ab78ea96dd3a33c199f46dadc47b"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.264668 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"712c7b85-7cb5-410a-9585-76642a5f47b4","Type":"ContainerStarted","Data":"ca955628fe520877c5c313b2a1272bb50267383a04fc5679a6690781694afe07"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.264696 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"712c7b85-7cb5-410a-9585-76642a5f47b4","Type":"ContainerStarted","Data":"79987e4329a555b177342a97ce764599c7a7d6a4211a49e91d0a5136d5c2ccc6"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.267743 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"b4bb9102-620e-46bf-8e38-a1c74a28d07f","Type":"ContainerStarted","Data":"e190a46e819c3f689542faf666438aab73b76f686caf515dcf68f2e22c122ec1"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.267827 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"b4bb9102-620e-46bf-8e38-a1c74a28d07f","Type":"ContainerStarted","Data":"a7b7cefa9540783b0187aae6ed75244956de55095071fa47d946bb962991d746"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.270251 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"4f13a9fe-98e8-4536-8c58-5c53d37be913","Type":"ContainerStarted","Data":"1c6a990202225e0d1072ee4fe15c2ed8021d664206b0c672ca571b9a69cdf226"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.270297 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"4f13a9fe-98e8-4536-8c58-5c53d37be913","Type":"ContainerStarted","Data":"74626c4abd0c6a7fcbb0b96d57faf6e14cb26c44baa26b44ab53dad1fd638f86"}
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.311184 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.31116324 podStartE2EDuration="3.31116324s" podCreationTimestamp="2025-12-01 10:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:21.309722671 +0000 UTC m=+1280.613481305" watchObservedRunningTime="2025-12-01 10:52:21.31116324 +0000 UTC m=+1280.614921884"
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.349271 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.3492574299999998 podStartE2EDuration="3.34925743s" podCreationTimestamp="2025-12-01 10:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:21.347384749 +0000 UTC m=+1280.651143393" watchObservedRunningTime="2025-12-01 10:52:21.34925743 +0000 UTC m=+1280.653016054"
Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.386137 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.386100587 podStartE2EDuration="3.386100587s" podCreationTimestamp="2025-12-01 10:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:21.373682087 +0000 UTC m=+1280.677440721" watchObservedRunningTime="2025-12-01 10:52:21.386100587 +0000 UTC m=+1280.689859211" Dec 01 10:52:21 crc kubenswrapper[4761]: I1201 10:52:21.414929 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.414906123 podStartE2EDuration="3.414906123s" podCreationTimestamp="2025-12-01 10:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:21.414840911 +0000 UTC m=+1280.718599545" watchObservedRunningTime="2025-12-01 10:52:21.414906123 +0000 UTC m=+1280.718664747" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.495185 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.495592 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.495631 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.506844 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.506999 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.507009 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:29 crc 
kubenswrapper[4761]: I1201 10:52:29.529641 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.551009 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.551265 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.551581 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.562352 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.570758 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.679501 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.679588 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.679602 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.708199 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.711491 4761 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.716302 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.800463 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.800826 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.800842 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.823759 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.828131 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:29 crc kubenswrapper[4761]: I1201 10:52:29.833761 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342830 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342881 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342896 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342907 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342918 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342931 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342944 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342955 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342966 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342977 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342988 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.342998 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.360107 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.360244 4761 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.360417 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.360997 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.362051 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.365099 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.365444 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.366502 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.366823 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.369650 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.373910 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:30 crc kubenswrapper[4761]: I1201 10:52:30.379912 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:31 crc kubenswrapper[4761]: I1201 10:52:31.799655 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:52:31 crc kubenswrapper[4761]: I1201 10:52:31.811301 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:52:31 crc kubenswrapper[4761]: I1201 10:52:31.967199 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:52:31 crc kubenswrapper[4761]: I1201 10:52:31.992658 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.369641 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-log" containerID="cri-o://2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.369762 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-api" containerID="cri-o://635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.370217 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-log" containerID="cri-o://54b54b9ac95ce8961922abf317ed0b36b5384d05df0019c5e7bbaa28df245766" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.369984 4761 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-log" containerID="cri-o://79987e4329a555b177342a97ce764599c7a7d6a4211a49e91d0a5136d5c2ccc6" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.370039 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-api" containerID="cri-o://056f1ab4ff432d9d267ecee55cdacce22018ab78ea96dd3a33c199f46dadc47b" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.370083 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-httpd" containerID="cri-o://ca955628fe520877c5c313b2a1272bb50267383a04fc5679a6690781694afe07" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.369839 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-httpd" containerID="cri-o://a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.370309 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-httpd" containerID="cri-o://74626c4abd0c6a7fcbb0b96d57faf6e14cb26c44baa26b44ab53dad1fd638f86" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.370301 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-api" 
containerID="cri-o://1c6a990202225e0d1072ee4fe15c2ed8021d664206b0c672ca571b9a69cdf226" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.370527 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-log" containerID="cri-o://cd3bcf6b5bfce0e339d5be08c27f296ce63b3207a61ffe06152d8cebceaac132" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.370620 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-httpd" containerID="cri-o://a7b7cefa9540783b0187aae6ed75244956de55095071fa47d946bb962991d746" gracePeriod=30 Dec 01 10:52:33 crc kubenswrapper[4761]: I1201 10:52:33.370690 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-api" containerID="cri-o://e190a46e819c3f689542faf666438aab73b76f686caf515dcf68f2e22c122ec1" gracePeriod=30 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.355755 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.399090 4761 generic.go:334] "Generic (PLEG): container finished" podID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerID="056f1ab4ff432d9d267ecee55cdacce22018ab78ea96dd3a33c199f46dadc47b" exitCode=0 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.399136 4761 generic.go:334] "Generic (PLEG): container finished" podID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerID="ca955628fe520877c5c313b2a1272bb50267383a04fc5679a6690781694afe07" exitCode=0 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.399153 4761 generic.go:334] "Generic (PLEG): container finished" podID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerID="79987e4329a555b177342a97ce764599c7a7d6a4211a49e91d0a5136d5c2ccc6" exitCode=143 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.399260 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"712c7b85-7cb5-410a-9585-76642a5f47b4","Type":"ContainerDied","Data":"056f1ab4ff432d9d267ecee55cdacce22018ab78ea96dd3a33c199f46dadc47b"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.399298 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"712c7b85-7cb5-410a-9585-76642a5f47b4","Type":"ContainerDied","Data":"ca955628fe520877c5c313b2a1272bb50267383a04fc5679a6690781694afe07"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.399315 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"712c7b85-7cb5-410a-9585-76642a5f47b4","Type":"ContainerDied","Data":"79987e4329a555b177342a97ce764599c7a7d6a4211a49e91d0a5136d5c2ccc6"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.402441 4761 generic.go:334] "Generic (PLEG): container finished" podID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" 
containerID="e190a46e819c3f689542faf666438aab73b76f686caf515dcf68f2e22c122ec1" exitCode=0 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.402477 4761 generic.go:334] "Generic (PLEG): container finished" podID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerID="a7b7cefa9540783b0187aae6ed75244956de55095071fa47d946bb962991d746" exitCode=0 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.402489 4761 generic.go:334] "Generic (PLEG): container finished" podID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerID="cd3bcf6b5bfce0e339d5be08c27f296ce63b3207a61ffe06152d8cebceaac132" exitCode=143 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.402523 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"b4bb9102-620e-46bf-8e38-a1c74a28d07f","Type":"ContainerDied","Data":"e190a46e819c3f689542faf666438aab73b76f686caf515dcf68f2e22c122ec1"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.402599 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"b4bb9102-620e-46bf-8e38-a1c74a28d07f","Type":"ContainerDied","Data":"a7b7cefa9540783b0187aae6ed75244956de55095071fa47d946bb962991d746"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.402614 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"b4bb9102-620e-46bf-8e38-a1c74a28d07f","Type":"ContainerDied","Data":"cd3bcf6b5bfce0e339d5be08c27f296ce63b3207a61ffe06152d8cebceaac132"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.408748 4761 generic.go:334] "Generic (PLEG): container finished" podID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerID="1c6a990202225e0d1072ee4fe15c2ed8021d664206b0c672ca571b9a69cdf226" exitCode=0 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.409017 4761 generic.go:334] "Generic (PLEG): container finished" podID="4f13a9fe-98e8-4536-8c58-5c53d37be913" 
containerID="74626c4abd0c6a7fcbb0b96d57faf6e14cb26c44baa26b44ab53dad1fd638f86" exitCode=0 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.409112 4761 generic.go:334] "Generic (PLEG): container finished" podID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerID="54b54b9ac95ce8961922abf317ed0b36b5384d05df0019c5e7bbaa28df245766" exitCode=143 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.409237 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"4f13a9fe-98e8-4536-8c58-5c53d37be913","Type":"ContainerDied","Data":"1c6a990202225e0d1072ee4fe15c2ed8021d664206b0c672ca571b9a69cdf226"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.409346 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"4f13a9fe-98e8-4536-8c58-5c53d37be913","Type":"ContainerDied","Data":"74626c4abd0c6a7fcbb0b96d57faf6e14cb26c44baa26b44ab53dad1fd638f86"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.409430 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"4f13a9fe-98e8-4536-8c58-5c53d37be913","Type":"ContainerDied","Data":"54b54b9ac95ce8961922abf317ed0b36b5384d05df0019c5e7bbaa28df245766"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413046 4761 generic.go:334] "Generic (PLEG): container finished" podID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerID="635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051" exitCode=0 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413127 4761 generic.go:334] "Generic (PLEG): container finished" podID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerID="a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177" exitCode=0 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413141 4761 generic.go:334] "Generic (PLEG): container finished" podID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" 
containerID="2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010" exitCode=143 Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413201 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5","Type":"ContainerDied","Data":"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413267 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5","Type":"ContainerDied","Data":"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413286 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5","Type":"ContainerDied","Data":"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413303 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5","Type":"ContainerDied","Data":"495d144267ebb59e329ad0dc4af47479208a928e79d04388bbe29cd006c769bc"} Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413326 4761 scope.go:117] "RemoveContainer" containerID="635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.413665 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.432915 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-logs\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.432965 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.432986 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-iscsi\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433009 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hgnv\" (UniqueName: \"kubernetes.io/projected/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-kube-api-access-4hgnv\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433065 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-httpd-run\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433089 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-var-locks-brick\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433117 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-scripts\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433152 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-run\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433190 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-sys\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433227 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-nvme\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433240 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433271 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-config-data\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433294 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-lib-modules\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433314 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-dev\") pod \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\" (UID: \"fb63d18a-acf5-4088-8616-5ccb8ab1d8f5\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433289 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-logs" (OuterVolumeSpecName: "logs") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433494 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-run" (OuterVolumeSpecName: "run") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433511 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433639 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-dev" (OuterVolumeSpecName: "dev") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433659 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-sys" (OuterVolumeSpecName: "sys") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.433674 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.434379 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.434653 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.434907 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.437491 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.439469 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-scripts" (OuterVolumeSpecName: "scripts") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.441410 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.444735 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.446467 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.450223 4761 scope.go:117] "RemoveContainer" containerID="a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.450601 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-kube-api-access-4hgnv" (OuterVolumeSpecName: "kube-api-access-4hgnv") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "kube-api-access-4hgnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.487513 4761 scope.go:117] "RemoveContainer" containerID="2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.505245 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.534997 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-lib-modules\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535031 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-dev\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535061 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-logs\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535101 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-iscsi\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535124 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-scripts\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535164 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-scripts\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535185 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkj69\" (UniqueName: \"kubernetes.io/projected/4f13a9fe-98e8-4536-8c58-5c53d37be913-kube-api-access-zkj69\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535222 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-run\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535254 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-config-data\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535271 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-config-data\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535293 4761 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535311 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-sys\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535326 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-nvme\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535345 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535359 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-run\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535373 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535399 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-nvme\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535413 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-lib-modules\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535428 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-var-locks-brick\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535442 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-iscsi\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535466 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-httpd-run\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535486 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-var-locks-brick\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc 
kubenswrapper[4761]: I1201 10:52:34.535510 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-logs\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535529 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-sys\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535560 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwv9l\" (UniqueName: \"kubernetes.io/projected/b4bb9102-620e-46bf-8e38-a1c74a28d07f-kube-api-access-lwv9l\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535574 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-dev\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: \"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535586 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\" (UID: \"b4bb9102-620e-46bf-8e38-a1c74a28d07f\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535603 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-httpd-run\") pod \"4f13a9fe-98e8-4536-8c58-5c53d37be913\" (UID: 
\"4f13a9fe-98e8-4536-8c58-5c53d37be913\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535867 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535878 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535899 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535908 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535917 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535926 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535941 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535952 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535963 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hgnv\" (UniqueName: \"kubernetes.io/projected/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-kube-api-access-4hgnv\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535972 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535980 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535173 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-dev" (OuterVolumeSpecName: "dev") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535990 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-logs" (OuterVolumeSpecName: "logs") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535989 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.536048 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535205 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535776 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535815 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535832 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535849 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535861 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.535874 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.536037 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-run" (OuterVolumeSpecName: "run") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.536117 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.536151 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-dev" (OuterVolumeSpecName: "dev") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.536350 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.536477 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-logs" (OuterVolumeSpecName: "logs") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.536565 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-sys" (OuterVolumeSpecName: "sys") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.537271 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.539717 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-sys" (OuterVolumeSpecName: "sys") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.543196 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-run" (OuterVolumeSpecName: "run") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.549216 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.554355 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-scripts" (OuterVolumeSpecName: "scripts") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.554364 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.554399 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-scripts" (OuterVolumeSpecName: "scripts") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.554367 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bb9102-620e-46bf-8e38-a1c74a28d07f-kube-api-access-lwv9l" (OuterVolumeSpecName: "kube-api-access-lwv9l") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "kube-api-access-lwv9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.563575 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.563821 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-config-data" (OuterVolumeSpecName: "config-data") pod "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" (UID: "fb63d18a-acf5-4088-8616-5ccb8ab1d8f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.573766 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.573923 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f13a9fe-98e8-4536-8c58-5c53d37be913-kube-api-access-zkj69" (OuterVolumeSpecName: "kube-api-access-zkj69") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "kube-api-access-zkj69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.581494 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.595161 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.630308 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-config-data" (OuterVolumeSpecName: "config-data") pod "4f13a9fe-98e8-4536-8c58-5c53d37be913" (UID: "4f13a9fe-98e8-4536-8c58-5c53d37be913"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636648 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-httpd-run\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636681 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-var-locks-brick\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636706 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636721 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-dev\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636739 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8j7s\" (UniqueName: \"kubernetes.io/projected/712c7b85-7cb5-410a-9585-76642a5f47b4-kube-api-access-b8j7s\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636777 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-config-data\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636806 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636830 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-scripts\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636848 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-lib-modules\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636861 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-run\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636889 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-logs\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636949 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-nvme\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.636965 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-iscsi\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637009 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-sys\") pod \"712c7b85-7cb5-410a-9585-76642a5f47b4\" (UID: \"712c7b85-7cb5-410a-9585-76642a5f47b4\") " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637270 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637287 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637304 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637314 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637323 4761 reconciler_common.go:293] "Volume detached for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637334 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637346 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637354 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637362 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637372 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637380 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637388 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637396 4761 
reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637404 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637412 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637419 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637427 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637434 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637442 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwv9l\" (UniqueName: \"kubernetes.io/projected/b4bb9102-620e-46bf-8e38-a1c74a28d07f-kube-api-access-lwv9l\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637450 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4f13a9fe-98e8-4536-8c58-5c53d37be913-dev\") on node \"crc\" DevicePath \"\"" 
Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637461 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637469 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f13a9fe-98e8-4536-8c58-5c53d37be913-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637477 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637485 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637493 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb9102-620e-46bf-8e38-a1c74a28d07f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637501 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637510 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b4bb9102-620e-46bf-8e38-a1c74a28d07f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637518 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f13a9fe-98e8-4536-8c58-5c53d37be913-scripts\") on node 
\"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637527 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkj69\" (UniqueName: \"kubernetes.io/projected/4f13a9fe-98e8-4536-8c58-5c53d37be913-kube-api-access-zkj69\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.637534 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.638282 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.638684 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-config-data" (OuterVolumeSpecName: "config-data") pod "b4bb9102-620e-46bf-8e38-a1c74a28d07f" (UID: "b4bb9102-620e-46bf-8e38-a1c74a28d07f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.638669 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.639065 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.639151 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-run" (OuterVolumeSpecName: "run") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.639404 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.639453 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.639482 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-logs" (OuterVolumeSpecName: "logs") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.639510 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-sys" (OuterVolumeSpecName: "sys") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.639984 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-dev" (OuterVolumeSpecName: "dev") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.640793 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.641227 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712c7b85-7cb5-410a-9585-76642a5f47b4-kube-api-access-b8j7s" (OuterVolumeSpecName: "kube-api-access-b8j7s") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "kube-api-access-b8j7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.641606 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-scripts" (OuterVolumeSpecName: "scripts") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.642602 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.652858 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.652904 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.652951 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.665624 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.700637 4761 scope.go:117] "RemoveContainer" containerID="635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051" Dec 01 10:52:34 crc kubenswrapper[4761]: E1201 10:52:34.700979 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051\": container with ID starting with 635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051 not found: ID does not exist" containerID="635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.701099 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051"} err="failed to get container status \"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051\": rpc error: code = NotFound desc = could 
not find container \"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051\": container with ID starting with 635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.701178 4761 scope.go:117] "RemoveContainer" containerID="a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177" Dec 01 10:52:34 crc kubenswrapper[4761]: E1201 10:52:34.701418 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177\": container with ID starting with a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177 not found: ID does not exist" containerID="a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.701438 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177"} err="failed to get container status \"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177\": rpc error: code = NotFound desc = could not find container \"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177\": container with ID starting with a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.701452 4761 scope.go:117] "RemoveContainer" containerID="2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010" Dec 01 10:52:34 crc kubenswrapper[4761]: E1201 10:52:34.701740 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010\": container with ID starting with 2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010 not found: 
ID does not exist" containerID="2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.701815 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010"} err="failed to get container status \"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010\": rpc error: code = NotFound desc = could not find container \"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010\": container with ID starting with 2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.701926 4761 scope.go:117] "RemoveContainer" containerID="635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.702388 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051"} err="failed to get container status \"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051\": rpc error: code = NotFound desc = could not find container \"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051\": container with ID starting with 635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.702443 4761 scope.go:117] "RemoveContainer" containerID="a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.702718 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177"} err="failed to get container status \"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177\": rpc error: code = 
NotFound desc = could not find container \"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177\": container with ID starting with a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.702805 4761 scope.go:117] "RemoveContainer" containerID="2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.703322 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010"} err="failed to get container status \"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010\": rpc error: code = NotFound desc = could not find container \"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010\": container with ID starting with 2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.703360 4761 scope.go:117] "RemoveContainer" containerID="635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.703859 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051"} err="failed to get container status \"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051\": rpc error: code = NotFound desc = could not find container \"635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051\": container with ID starting with 635e9cba3c5ecd97030b166f32b473118b1b6974d90a653c7103206604316051 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.703895 4761 scope.go:117] "RemoveContainer" containerID="a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177" Dec 01 10:52:34 crc 
kubenswrapper[4761]: I1201 10:52:34.704140 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177"} err="failed to get container status \"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177\": rpc error: code = NotFound desc = could not find container \"a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177\": container with ID starting with a1ae115ab4572548f780f274a8e1ab33feb8d9ddfbb6216dcd9a901ccdc35177 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.704211 4761 scope.go:117] "RemoveContainer" containerID="2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.704529 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010"} err="failed to get container status \"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010\": rpc error: code = NotFound desc = could not find container \"2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010\": container with ID starting with 2afacc76a8e34d676dba393fce22d431a3dc1ec889cfeb84cef18e4010fb4010 not found: ID does not exist" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.718587 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-config-data" (OuterVolumeSpecName: "config-data") pod "712c7b85-7cb5-410a-9585-76642a5f47b4" (UID: "712c7b85-7cb5-410a-9585-76642a5f47b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741809 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741844 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741856 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb9102-620e-46bf-8e38-a1c74a28d07f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741864 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741872 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741881 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741888 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741896 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741904 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741934 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741942 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741951 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8j7s\" (UniqueName: \"kubernetes.io/projected/712c7b85-7cb5-410a-9585-76642a5f47b4-kube-api-access-b8j7s\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741959 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741967 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741978 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741986 4761 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/712c7b85-7cb5-410a-9585-76642a5f47b4-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.741994 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.742001 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/712c7b85-7cb5-410a-9585-76642a5f47b4-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.742009 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712c7b85-7cb5-410a-9585-76642a5f47b4-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.745932 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.755247 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.757676 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.764634 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.843908 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 
10:52:34 crc kubenswrapper[4761]: I1201 10:52:34.843937 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.141673 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" path="/var/lib/kubelet/pods/fb63d18a-acf5-4088-8616-5ccb8ab1d8f5/volumes" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.425321 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"b4bb9102-620e-46bf-8e38-a1c74a28d07f","Type":"ContainerDied","Data":"0d0be37dcb1ca6daf85b39e5608e9d4149ce90b80c778f53e8c2e9cabc2e4e06"} Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.425372 4761 scope.go:117] "RemoveContainer" containerID="e190a46e819c3f689542faf666438aab73b76f686caf515dcf68f2e22c122ec1" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.425516 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.430343 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"4f13a9fe-98e8-4536-8c58-5c53d37be913","Type":"ContainerDied","Data":"51f1379759d0fa2461b7fb0890f06150fb0f1dce593333da04fd6a366dba19e0"} Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.430467 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.437384 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"712c7b85-7cb5-410a-9585-76642a5f47b4","Type":"ContainerDied","Data":"b11d9e181d2d4e580d1d6365992635822c20c2a4c01b45c72530dca4d13bb2f4"} Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.437507 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.457183 4761 scope.go:117] "RemoveContainer" containerID="a7b7cefa9540783b0187aae6ed75244956de55095071fa47d946bb962991d746" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.467543 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.480065 4761 scope.go:117] "RemoveContainer" containerID="cd3bcf6b5bfce0e339d5be08c27f296ce63b3207a61ffe06152d8cebceaac132" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.485194 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.499210 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.522229 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.522298 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.522312 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.530029 4761 scope.go:117] "RemoveContainer" containerID="1c6a990202225e0d1072ee4fe15c2ed8021d664206b0c672ca571b9a69cdf226" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.549528 4761 scope.go:117] "RemoveContainer" containerID="74626c4abd0c6a7fcbb0b96d57faf6e14cb26c44baa26b44ab53dad1fd638f86" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.566433 4761 scope.go:117] "RemoveContainer" containerID="54b54b9ac95ce8961922abf317ed0b36b5384d05df0019c5e7bbaa28df245766" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.581198 4761 scope.go:117] "RemoveContainer" containerID="056f1ab4ff432d9d267ecee55cdacce22018ab78ea96dd3a33c199f46dadc47b" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.605011 4761 scope.go:117] "RemoveContainer" containerID="ca955628fe520877c5c313b2a1272bb50267383a04fc5679a6690781694afe07" Dec 01 10:52:35 crc kubenswrapper[4761]: I1201 10:52:35.631879 4761 scope.go:117] "RemoveContainer" containerID="79987e4329a555b177342a97ce764599c7a7d6a4211a49e91d0a5136d5c2ccc6" Dec 01 10:52:36 crc kubenswrapper[4761]: I1201 10:52:36.618407 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:36 crc kubenswrapper[4761]: I1201 10:52:36.618781 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-log" containerID="cri-o://dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e" gracePeriod=30 Dec 01 10:52:36 crc kubenswrapper[4761]: I1201 10:52:36.618885 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-api" 
containerID="cri-o://7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d" gracePeriod=30 Dec 01 10:52:36 crc kubenswrapper[4761]: I1201 10:52:36.618890 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-httpd" containerID="cri-o://5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8" gracePeriod=30 Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.139257 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" path="/var/lib/kubelet/pods/4f13a9fe-98e8-4536-8c58-5c53d37be913/volumes" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.140644 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" path="/var/lib/kubelet/pods/712c7b85-7cb5-410a-9585-76642a5f47b4/volumes" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.142153 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" path="/var/lib/kubelet/pods/b4bb9102-620e-46bf-8e38-a1c74a28d07f/volumes" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.144182 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.144582 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-log" containerID="cri-o://fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05" gracePeriod=30 Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.144614 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" 
podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-api" containerID="cri-o://a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f" gracePeriod=30 Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.144642 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-httpd" containerID="cri-o://a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198" gracePeriod=30 Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.326983 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399267 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-nvme\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399352 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-scripts\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399401 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-run\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399407 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-nvme" (OuterVolumeSpecName: 
"etc-nvme") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399441 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-httpd-run\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399456 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-run" (OuterVolumeSpecName: "run") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399466 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-dev\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399496 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399521 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-logs\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399598 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-sys\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399615 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399633 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76c6c\" (UniqueName: \"kubernetes.io/projected/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-kube-api-access-76c6c\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399651 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-config-data\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399679 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-var-locks-brick\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399694 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-lib-modules\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: 
\"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399714 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-iscsi\") pod \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\" (UID: \"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e\") " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.400289 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.400310 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399753 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-dev" (OuterVolumeSpecName: "dev") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399790 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-sys" (OuterVolumeSpecName: "sys") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.399783 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.400078 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-logs" (OuterVolumeSpecName: "logs") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.400101 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.400248 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.400296 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.404679 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.407978 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-kube-api-access-76c6c" (OuterVolumeSpecName: "kube-api-access-76c6c") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "kube-api-access-76c6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.408081 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-scripts" (OuterVolumeSpecName: "scripts") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.409149 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.465776 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerID="7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d" exitCode=0 Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.465814 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerID="5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8" exitCode=0 Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.465823 4761 generic.go:334] "Generic (PLEG): container finished" podID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerID="dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e" exitCode=143 Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.465862 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e","Type":"ContainerDied","Data":"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d"} Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.465892 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e","Type":"ContainerDied","Data":"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8"} Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.465901 4761 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e","Type":"ContainerDied","Data":"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e"} Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.465910 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f4877a9b-bec0-4d49-9c5a-9890c75a3a6e","Type":"ContainerDied","Data":"99189e156412f3b829310e7f180edeef00cc8b631b6a4c39429f635a504201d0"} Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.465925 4761 scope.go:117] "RemoveContainer" containerID="7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.466040 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.471303 4761 generic.go:334] "Generic (PLEG): container finished" podID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerID="fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05" exitCode=143 Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.471349 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d64d16c7-7040-44fa-a124-e8dde25108fd","Type":"ContainerDied","Data":"fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05"} Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.480516 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-config-data" (OuterVolumeSpecName: "config-data") pod "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" (UID: "f4877a9b-bec0-4d49-9c5a-9890c75a3a6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.490031 4761 scope.go:117] "RemoveContainer" containerID="5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501728 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501759 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501788 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501803 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501814 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501832 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501842 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76c6c\" (UniqueName: \"kubernetes.io/projected/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-kube-api-access-76c6c\") on node 
\"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501852 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501862 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501872 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501882 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.501891 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.517824 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.519730 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.521269 4761 scope.go:117] "RemoveContainer" containerID="dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e" Dec 01 10:52:37 crc 
kubenswrapper[4761]: I1201 10:52:37.538448 4761 scope.go:117] "RemoveContainer" containerID="7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d" Dec 01 10:52:37 crc kubenswrapper[4761]: E1201 10:52:37.538783 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d\": container with ID starting with 7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d not found: ID does not exist" containerID="7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.538814 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d"} err="failed to get container status \"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d\": rpc error: code = NotFound desc = could not find container \"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d\": container with ID starting with 7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d not found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.538838 4761 scope.go:117] "RemoveContainer" containerID="5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8" Dec 01 10:52:37 crc kubenswrapper[4761]: E1201 10:52:37.539138 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8\": container with ID starting with 5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8 not found: ID does not exist" containerID="5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.539164 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8"} err="failed to get container status \"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8\": rpc error: code = NotFound desc = could not find container \"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8\": container with ID starting with 5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8 not found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.539177 4761 scope.go:117] "RemoveContainer" containerID="dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e" Dec 01 10:52:37 crc kubenswrapper[4761]: E1201 10:52:37.539426 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e\": container with ID starting with dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e not found: ID does not exist" containerID="dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.539454 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e"} err="failed to get container status \"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e\": rpc error: code = NotFound desc = could not find container \"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e\": container with ID starting with dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e not found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.539470 4761 scope.go:117] "RemoveContainer" containerID="7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.539724 4761 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d"} err="failed to get container status \"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d\": rpc error: code = NotFound desc = could not find container \"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d\": container with ID starting with 7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d not found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.539749 4761 scope.go:117] "RemoveContainer" containerID="5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.540096 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8"} err="failed to get container status \"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8\": rpc error: code = NotFound desc = could not find container \"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8\": container with ID starting with 5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8 not found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.540143 4761 scope.go:117] "RemoveContainer" containerID="dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.540896 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e"} err="failed to get container status \"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e\": rpc error: code = NotFound desc = could not find container \"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e\": container with ID starting with dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e not 
found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.540919 4761 scope.go:117] "RemoveContainer" containerID="7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.541142 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d"} err="failed to get container status \"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d\": rpc error: code = NotFound desc = could not find container \"7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d\": container with ID starting with 7b842c975272e101357212b51433c73ab52c61c58f6144f7347fa113f1adee0d not found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.541169 4761 scope.go:117] "RemoveContainer" containerID="5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.541350 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8"} err="failed to get container status \"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8\": rpc error: code = NotFound desc = could not find container \"5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8\": container with ID starting with 5acc3e3c99dc4b028f39e867812cf375574c9eceba94060130aa7c570c576fa8 not found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.541378 4761 scope.go:117] "RemoveContainer" containerID="dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.541651 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e"} err="failed to get 
container status \"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e\": rpc error: code = NotFound desc = could not find container \"dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e\": container with ID starting with dc2b7528487940a62da83c17a403bc6e42b8b6e316df24492dcfe24db1f83d7e not found: ID does not exist" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.605884 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.605917 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.806085 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:37 crc kubenswrapper[4761]: I1201 10:52:37.812511 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:52:38 crc kubenswrapper[4761]: I1201 10:52:38.484261 4761 generic.go:334] "Generic (PLEG): container finished" podID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerID="a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f" exitCode=0 Dec 01 10:52:38 crc kubenswrapper[4761]: I1201 10:52:38.484317 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d64d16c7-7040-44fa-a124-e8dde25108fd","Type":"ContainerDied","Data":"a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f"} Dec 01 10:52:38 crc kubenswrapper[4761]: I1201 10:52:38.925301 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.028853 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-config-data\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.028921 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-run\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.028957 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-nvme\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.028983 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-dev\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029009 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029039 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-scripts\") pod 
\"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029075 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-lib-modules\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029149 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69rgr\" (UniqueName: \"kubernetes.io/projected/d64d16c7-7040-44fa-a124-e8dde25108fd-kube-api-access-69rgr\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029216 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-logs\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029142 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-run" (OuterVolumeSpecName: "run") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029221 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-dev" (OuterVolumeSpecName: "dev") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029248 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-httpd-run\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029395 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-var-locks-brick\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029434 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-iscsi\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029456 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-sys\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029505 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d64d16c7-7040-44fa-a124-e8dde25108fd\" (UID: \"d64d16c7-7040-44fa-a124-e8dde25108fd\") " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029178 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-nvme" (OuterVolumeSpecName: 
"etc-nvme") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029208 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029674 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029710 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-logs" (OuterVolumeSpecName: "logs") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029764 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029725 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.029797 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-sys" (OuterVolumeSpecName: "sys") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030183 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030212 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030225 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030236 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030247 4761 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030258 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64d16c7-7040-44fa-a124-e8dde25108fd-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030269 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030281 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.030292 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d64d16c7-7040-44fa-a124-e8dde25108fd-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.034030 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64d16c7-7040-44fa-a124-e8dde25108fd-kube-api-access-69rgr" (OuterVolumeSpecName: "kube-api-access-69rgr") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "kube-api-access-69rgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.034376 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-scripts" (OuterVolumeSpecName: "scripts") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.034905 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.036418 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.094579 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-config-data" (OuterVolumeSpecName: "config-data") pod "d64d16c7-7040-44fa-a124-e8dde25108fd" (UID: "d64d16c7-7040-44fa-a124-e8dde25108fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.131411 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.131456 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.131482 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.131494 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64d16c7-7040-44fa-a124-e8dde25108fd-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.131511 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69rgr\" (UniqueName: \"kubernetes.io/projected/d64d16c7-7040-44fa-a124-e8dde25108fd-kube-api-access-69rgr\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.138533 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" path="/var/lib/kubelet/pods/f4877a9b-bec0-4d49-9c5a-9890c75a3a6e/volumes" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.149574 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.156072 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.233351 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.233383 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.495271 4761 generic.go:334] "Generic (PLEG): container finished" podID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerID="a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198" exitCode=0 Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.495323 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d64d16c7-7040-44fa-a124-e8dde25108fd","Type":"ContainerDied","Data":"a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198"} Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.495358 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d64d16c7-7040-44fa-a124-e8dde25108fd","Type":"ContainerDied","Data":"2e4445fe3fad073b2628b1effc84a4aa35ddfe00d4dade841ac9734c3e6b53b0"} Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.495385 4761 scope.go:117] "RemoveContainer" containerID="a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.495620 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.528014 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.533187 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.535089 4761 scope.go:117] "RemoveContainer" containerID="a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.561904 4761 scope.go:117] "RemoveContainer" containerID="fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.583423 4761 scope.go:117] "RemoveContainer" containerID="a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f" Dec 01 10:52:39 crc kubenswrapper[4761]: E1201 10:52:39.584939 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f\": container with ID starting with a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f not found: ID does not exist" containerID="a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.584979 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f"} err="failed to get container status \"a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f\": rpc error: code = NotFound desc = could not find container \"a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f\": container with ID starting with a3d8dec23a205b5fec34cc73cddc2b5fd421cc502f36d7dc2d93c1bf46b6fa6f not 
found: ID does not exist" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.585004 4761 scope.go:117] "RemoveContainer" containerID="a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198" Dec 01 10:52:39 crc kubenswrapper[4761]: E1201 10:52:39.585611 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198\": container with ID starting with a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198 not found: ID does not exist" containerID="a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.585685 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198"} err="failed to get container status \"a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198\": rpc error: code = NotFound desc = could not find container \"a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198\": container with ID starting with a2610d5329bfd8e1d359fd6ddbafa2bdfbe6649d14deb170dcf10c2f472b4198 not found: ID does not exist" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.585712 4761 scope.go:117] "RemoveContainer" containerID="fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05" Dec 01 10:52:39 crc kubenswrapper[4761]: E1201 10:52:39.586058 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05\": container with ID starting with fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05 not found: ID does not exist" containerID="fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05" Dec 01 10:52:39 crc kubenswrapper[4761]: I1201 10:52:39.586086 4761 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05"} err="failed to get container status \"fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05\": rpc error: code = NotFound desc = could not find container \"fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05\": container with ID starting with fb9571930128084802007f46991a1ac6f16fe771ccb036c626c7d76349e5bf05 not found: ID does not exist" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.890965 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-27bkx"] Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.898851 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-27bkx"] Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.918656 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance065c-account-delete-x8qvh"] Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.918976 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.918996 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919027 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919036 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919046 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" 
containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919054 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919070 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919079 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919101 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919108 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919127 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919135 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919146 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919154 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919165 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-api" Dec 01 10:52:40 crc 
kubenswrapper[4761]: I1201 10:52:40.919173 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919189 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919196 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919214 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919221 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919234 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919243 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919265 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919275 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919298 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919308 4761 
state_mem.go:107] "Deleted CPUSet assignment" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919325 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919334 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919349 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919359 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919383 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919392 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919405 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919415 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: E1201 10:52:40.919429 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919440 4761 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919640 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919663 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919676 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919690 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919702 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919718 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919735 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919748 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919763 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919783 4761 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919802 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919820 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919833 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4877a9b-bec0-4d49-9c5a-9890c75a3a6e" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919843 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-log" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919859 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb63d18a-acf5-4088-8616-5ccb8ab1d8f5" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919875 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f13a9fe-98e8-4536-8c58-5c53d37be913" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919887 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="712c7b85-7cb5-410a-9585-76642a5f47b4" containerName="glance-httpd" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.919901 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bb9102-620e-46bf-8e38-a1c74a28d07f" containerName="glance-api" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.920541 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:40 crc kubenswrapper[4761]: I1201 10:52:40.940535 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance065c-account-delete-x8qvh"] Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.064811 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grdv7\" (UniqueName: \"kubernetes.io/projected/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-kube-api-access-grdv7\") pod \"glance065c-account-delete-x8qvh\" (UID: \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\") " pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.064909 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-operator-scripts\") pod \"glance065c-account-delete-x8qvh\" (UID: \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\") " pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.137361 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d64d16c7-7040-44fa-a124-e8dde25108fd" path="/var/lib/kubelet/pods/d64d16c7-7040-44fa-a124-e8dde25108fd/volumes" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.138176 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53292ab-0210-41f8-a531-9ad4e8d0cffc" path="/var/lib/kubelet/pods/f53292ab-0210-41f8-a531-9ad4e8d0cffc/volumes" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.169332 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-operator-scripts\") pod \"glance065c-account-delete-x8qvh\" (UID: \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\") " 
pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.169479 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grdv7\" (UniqueName: \"kubernetes.io/projected/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-kube-api-access-grdv7\") pod \"glance065c-account-delete-x8qvh\" (UID: \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\") " pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.170235 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-operator-scripts\") pod \"glance065c-account-delete-x8qvh\" (UID: \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\") " pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.202345 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grdv7\" (UniqueName: \"kubernetes.io/projected/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-kube-api-access-grdv7\") pod \"glance065c-account-delete-x8qvh\" (UID: \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\") " pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.287090 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:41 crc kubenswrapper[4761]: I1201 10:52:41.708404 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance065c-account-delete-x8qvh"] Dec 01 10:52:41 crc kubenswrapper[4761]: W1201 10:52:41.714483 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeee5fae0_2e3e_4ca0_ac63_6f01d77f2a9e.slice/crio-95904b187f0eb2807a3a0c9edc9ee645783f857c9d9cf30cb64a423e7da479d9 WatchSource:0}: Error finding container 95904b187f0eb2807a3a0c9edc9ee645783f857c9d9cf30cb64a423e7da479d9: Status 404 returned error can't find the container with id 95904b187f0eb2807a3a0c9edc9ee645783f857c9d9cf30cb64a423e7da479d9 Dec 01 10:52:42 crc kubenswrapper[4761]: I1201 10:52:42.524153 4761 generic.go:334] "Generic (PLEG): container finished" podID="eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e" containerID="7964fbd6798c56ca805db0689a7899be84af2c935218a9cc77e4848aef4e500a" exitCode=0 Dec 01 10:52:42 crc kubenswrapper[4761]: I1201 10:52:42.524462 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" event={"ID":"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e","Type":"ContainerDied","Data":"7964fbd6798c56ca805db0689a7899be84af2c935218a9cc77e4848aef4e500a"} Dec 01 10:52:42 crc kubenswrapper[4761]: I1201 10:52:42.524488 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" event={"ID":"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e","Type":"ContainerStarted","Data":"95904b187f0eb2807a3a0c9edc9ee645783f857c9d9cf30cb64a423e7da479d9"} Dec 01 10:52:43 crc kubenswrapper[4761]: I1201 10:52:43.881098 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.029146 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grdv7\" (UniqueName: \"kubernetes.io/projected/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-kube-api-access-grdv7\") pod \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\" (UID: \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\") " Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.029357 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-operator-scripts\") pod \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\" (UID: \"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e\") " Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.030698 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e" (UID: "eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.044760 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-kube-api-access-grdv7" (OuterVolumeSpecName: "kube-api-access-grdv7") pod "eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e" (UID: "eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e"). InnerVolumeSpecName "kube-api-access-grdv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.131401 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.131454 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grdv7\" (UniqueName: \"kubernetes.io/projected/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e-kube-api-access-grdv7\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.542412 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" event={"ID":"eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e","Type":"ContainerDied","Data":"95904b187f0eb2807a3a0c9edc9ee645783f857c9d9cf30cb64a423e7da479d9"} Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.542797 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95904b187f0eb2807a3a0c9edc9ee645783f857c9d9cf30cb64a423e7da479d9" Dec 01 10:52:44 crc kubenswrapper[4761]: I1201 10:52:44.542498 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance065c-account-delete-x8qvh" Dec 01 10:52:45 crc kubenswrapper[4761]: I1201 10:52:45.969200 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-cm2gp"] Dec 01 10:52:45 crc kubenswrapper[4761]: I1201 10:52:45.980816 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-cm2gp"] Dec 01 10:52:45 crc kubenswrapper[4761]: I1201 10:52:45.990040 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance065c-account-delete-x8qvh"] Dec 01 10:52:45 crc kubenswrapper[4761]: I1201 10:52:45.996721 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-065c-account-create-update-kspzx"] Dec 01 10:52:46 crc kubenswrapper[4761]: I1201 10:52:46.004391 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance065c-account-delete-x8qvh"] Dec 01 10:52:46 crc kubenswrapper[4761]: I1201 10:52:46.011385 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-065c-account-create-update-kspzx"] Dec 01 10:52:46 crc kubenswrapper[4761]: I1201 10:52:46.940037 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-scc2g"] Dec 01 10:52:46 crc kubenswrapper[4761]: E1201 10:52:46.940392 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e" containerName="mariadb-account-delete" Dec 01 10:52:46 crc kubenswrapper[4761]: I1201 10:52:46.940410 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e" containerName="mariadb-account-delete" Dec 01 10:52:46 crc kubenswrapper[4761]: I1201 10:52:46.940583 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e" containerName="mariadb-account-delete" Dec 01 10:52:46 crc kubenswrapper[4761]: I1201 10:52:46.941358 4761 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:46 crc kubenswrapper[4761]: I1201 10:52:46.950121 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-scc2g"] Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.042390 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-af1e-account-create-update-cs6qz"] Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.043474 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.045981 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.057500 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-af1e-account-create-update-cs6qz"] Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.083600 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91c91560-7b25-4429-94e9-aa13a92bf417-operator-scripts\") pod \"glance-db-create-scc2g\" (UID: \"91c91560-7b25-4429-94e9-aa13a92bf417\") " pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.083749 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp5lk\" (UniqueName: \"kubernetes.io/projected/91c91560-7b25-4429-94e9-aa13a92bf417-kube-api-access-rp5lk\") pod \"glance-db-create-scc2g\" (UID: \"91c91560-7b25-4429-94e9-aa13a92bf417\") " pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.138200 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="06997588-fb51-4b96-9f3a-22910179ca36" path="/var/lib/kubelet/pods/06997588-fb51-4b96-9f3a-22910179ca36/volumes" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.138885 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8724ddb4-e098-4eb8-9173-946c5b080b42" path="/var/lib/kubelet/pods/8724ddb4-e098-4eb8-9173-946c5b080b42/volumes" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.139362 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e" path="/var/lib/kubelet/pods/eee5fae0-2e3e-4ca0-ac63-6f01d77f2a9e/volumes" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.185756 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9664d36-3b98-46d7-8af1-b1636a2f370a-operator-scripts\") pod \"glance-af1e-account-create-update-cs6qz\" (UID: \"c9664d36-3b98-46d7-8af1-b1636a2f370a\") " pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.185864 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp5lk\" (UniqueName: \"kubernetes.io/projected/91c91560-7b25-4429-94e9-aa13a92bf417-kube-api-access-rp5lk\") pod \"glance-db-create-scc2g\" (UID: \"91c91560-7b25-4429-94e9-aa13a92bf417\") " pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.185946 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5hd\" (UniqueName: \"kubernetes.io/projected/c9664d36-3b98-46d7-8af1-b1636a2f370a-kube-api-access-cm5hd\") pod \"glance-af1e-account-create-update-cs6qz\" (UID: \"c9664d36-3b98-46d7-8af1-b1636a2f370a\") " pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.186026 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91c91560-7b25-4429-94e9-aa13a92bf417-operator-scripts\") pod \"glance-db-create-scc2g\" (UID: \"91c91560-7b25-4429-94e9-aa13a92bf417\") " pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.187468 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91c91560-7b25-4429-94e9-aa13a92bf417-operator-scripts\") pod \"glance-db-create-scc2g\" (UID: \"91c91560-7b25-4429-94e9-aa13a92bf417\") " pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.220961 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp5lk\" (UniqueName: \"kubernetes.io/projected/91c91560-7b25-4429-94e9-aa13a92bf417-kube-api-access-rp5lk\") pod \"glance-db-create-scc2g\" (UID: \"91c91560-7b25-4429-94e9-aa13a92bf417\") " pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.280222 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.287118 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9664d36-3b98-46d7-8af1-b1636a2f370a-operator-scripts\") pod \"glance-af1e-account-create-update-cs6qz\" (UID: \"c9664d36-3b98-46d7-8af1-b1636a2f370a\") " pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.287195 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5hd\" (UniqueName: \"kubernetes.io/projected/c9664d36-3b98-46d7-8af1-b1636a2f370a-kube-api-access-cm5hd\") pod \"glance-af1e-account-create-update-cs6qz\" (UID: \"c9664d36-3b98-46d7-8af1-b1636a2f370a\") " pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.288976 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9664d36-3b98-46d7-8af1-b1636a2f370a-operator-scripts\") pod \"glance-af1e-account-create-update-cs6qz\" (UID: \"c9664d36-3b98-46d7-8af1-b1636a2f370a\") " pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.310081 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5hd\" (UniqueName: \"kubernetes.io/projected/c9664d36-3b98-46d7-8af1-b1636a2f370a-kube-api-access-cm5hd\") pod \"glance-af1e-account-create-update-cs6qz\" (UID: \"c9664d36-3b98-46d7-8af1-b1636a2f370a\") " pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.362003 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.532620 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-scc2g"] Dec 01 10:52:47 crc kubenswrapper[4761]: W1201 10:52:47.541736 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91c91560_7b25_4429_94e9_aa13a92bf417.slice/crio-21f8b57b581434ac8ebcabb33558b9eb6a3265629496201ce14c4c72411d61e5 WatchSource:0}: Error finding container 21f8b57b581434ac8ebcabb33558b9eb6a3265629496201ce14c4c72411d61e5: Status 404 returned error can't find the container with id 21f8b57b581434ac8ebcabb33558b9eb6a3265629496201ce14c4c72411d61e5 Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.591390 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-scc2g" event={"ID":"91c91560-7b25-4429-94e9-aa13a92bf417","Type":"ContainerStarted","Data":"21f8b57b581434ac8ebcabb33558b9eb6a3265629496201ce14c4c72411d61e5"} Dec 01 10:52:47 crc kubenswrapper[4761]: I1201 10:52:47.918165 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-af1e-account-create-update-cs6qz"] Dec 01 10:52:47 crc kubenswrapper[4761]: W1201 10:52:47.925138 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9664d36_3b98_46d7_8af1_b1636a2f370a.slice/crio-a5ec4b84d12923f61d3a726af369ac1e3987700d60d7ed5b67ec5de641bed0d3 WatchSource:0}: Error finding container a5ec4b84d12923f61d3a726af369ac1e3987700d60d7ed5b67ec5de641bed0d3: Status 404 returned error can't find the container with id a5ec4b84d12923f61d3a726af369ac1e3987700d60d7ed5b67ec5de641bed0d3 Dec 01 10:52:48 crc kubenswrapper[4761]: I1201 10:52:48.602763 4761 generic.go:334] "Generic (PLEG): container finished" 
podID="91c91560-7b25-4429-94e9-aa13a92bf417" containerID="24d0f43937ac60404dda8bc82a721a04db080713375c4f92443dac9ba5ebe113" exitCode=0 Dec 01 10:52:48 crc kubenswrapper[4761]: I1201 10:52:48.602860 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-scc2g" event={"ID":"91c91560-7b25-4429-94e9-aa13a92bf417","Type":"ContainerDied","Data":"24d0f43937ac60404dda8bc82a721a04db080713375c4f92443dac9ba5ebe113"} Dec 01 10:52:48 crc kubenswrapper[4761]: I1201 10:52:48.606969 4761 generic.go:334] "Generic (PLEG): container finished" podID="c9664d36-3b98-46d7-8af1-b1636a2f370a" containerID="dc4fb35971e2f60efc7a8d59fd1284528556b794ba4733b76cb1c50ed7890466" exitCode=0 Dec 01 10:52:48 crc kubenswrapper[4761]: I1201 10:52:48.607074 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" event={"ID":"c9664d36-3b98-46d7-8af1-b1636a2f370a","Type":"ContainerDied","Data":"dc4fb35971e2f60efc7a8d59fd1284528556b794ba4733b76cb1c50ed7890466"} Dec 01 10:52:48 crc kubenswrapper[4761]: I1201 10:52:48.607451 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" event={"ID":"c9664d36-3b98-46d7-8af1-b1636a2f370a","Type":"ContainerStarted","Data":"a5ec4b84d12923f61d3a726af369ac1e3987700d60d7ed5b67ec5de641bed0d3"} Dec 01 10:52:49 crc kubenswrapper[4761]: I1201 10:52:49.971451 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:49 crc kubenswrapper[4761]: I1201 10:52:49.979138 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.141747 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9664d36-3b98-46d7-8af1-b1636a2f370a-operator-scripts\") pod \"c9664d36-3b98-46d7-8af1-b1636a2f370a\" (UID: \"c9664d36-3b98-46d7-8af1-b1636a2f370a\") " Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.141880 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm5hd\" (UniqueName: \"kubernetes.io/projected/c9664d36-3b98-46d7-8af1-b1636a2f370a-kube-api-access-cm5hd\") pod \"c9664d36-3b98-46d7-8af1-b1636a2f370a\" (UID: \"c9664d36-3b98-46d7-8af1-b1636a2f370a\") " Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.141967 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91c91560-7b25-4429-94e9-aa13a92bf417-operator-scripts\") pod \"91c91560-7b25-4429-94e9-aa13a92bf417\" (UID: \"91c91560-7b25-4429-94e9-aa13a92bf417\") " Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.142013 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp5lk\" (UniqueName: \"kubernetes.io/projected/91c91560-7b25-4429-94e9-aa13a92bf417-kube-api-access-rp5lk\") pod \"91c91560-7b25-4429-94e9-aa13a92bf417\" (UID: \"91c91560-7b25-4429-94e9-aa13a92bf417\") " Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.142682 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c91560-7b25-4429-94e9-aa13a92bf417-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91c91560-7b25-4429-94e9-aa13a92bf417" (UID: "91c91560-7b25-4429-94e9-aa13a92bf417"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.142711 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9664d36-3b98-46d7-8af1-b1636a2f370a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9664d36-3b98-46d7-8af1-b1636a2f370a" (UID: "c9664d36-3b98-46d7-8af1-b1636a2f370a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.147095 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c91560-7b25-4429-94e9-aa13a92bf417-kube-api-access-rp5lk" (OuterVolumeSpecName: "kube-api-access-rp5lk") pod "91c91560-7b25-4429-94e9-aa13a92bf417" (UID: "91c91560-7b25-4429-94e9-aa13a92bf417"). InnerVolumeSpecName "kube-api-access-rp5lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.147338 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9664d36-3b98-46d7-8af1-b1636a2f370a-kube-api-access-cm5hd" (OuterVolumeSpecName: "kube-api-access-cm5hd") pod "c9664d36-3b98-46d7-8af1-b1636a2f370a" (UID: "c9664d36-3b98-46d7-8af1-b1636a2f370a"). InnerVolumeSpecName "kube-api-access-cm5hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.246615 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm5hd\" (UniqueName: \"kubernetes.io/projected/c9664d36-3b98-46d7-8af1-b1636a2f370a-kube-api-access-cm5hd\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.246662 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91c91560-7b25-4429-94e9-aa13a92bf417-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.246676 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp5lk\" (UniqueName: \"kubernetes.io/projected/91c91560-7b25-4429-94e9-aa13a92bf417-kube-api-access-rp5lk\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.246687 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9664d36-3b98-46d7-8af1-b1636a2f370a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.630688 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.630690 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-af1e-account-create-update-cs6qz" event={"ID":"c9664d36-3b98-46d7-8af1-b1636a2f370a","Type":"ContainerDied","Data":"a5ec4b84d12923f61d3a726af369ac1e3987700d60d7ed5b67ec5de641bed0d3"} Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.630877 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ec4b84d12923f61d3a726af369ac1e3987700d60d7ed5b67ec5de641bed0d3" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.633283 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-scc2g" event={"ID":"91c91560-7b25-4429-94e9-aa13a92bf417","Type":"ContainerDied","Data":"21f8b57b581434ac8ebcabb33558b9eb6a3265629496201ce14c4c72411d61e5"} Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.633311 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f8b57b581434ac8ebcabb33558b9eb6a3265629496201ce14c4c72411d61e5" Dec 01 10:52:50 crc kubenswrapper[4761]: I1201 10:52:50.633392 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-scc2g" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.169964 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-hw5vn"] Dec 01 10:52:52 crc kubenswrapper[4761]: E1201 10:52:52.170670 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c91560-7b25-4429-94e9-aa13a92bf417" containerName="mariadb-database-create" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.170685 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c91560-7b25-4429-94e9-aa13a92bf417" containerName="mariadb-database-create" Dec 01 10:52:52 crc kubenswrapper[4761]: E1201 10:52:52.170707 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9664d36-3b98-46d7-8af1-b1636a2f370a" containerName="mariadb-account-create-update" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.170716 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9664d36-3b98-46d7-8af1-b1636a2f370a" containerName="mariadb-account-create-update" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.170838 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c91560-7b25-4429-94e9-aa13a92bf417" containerName="mariadb-database-create" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.170860 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9664d36-3b98-46d7-8af1-b1636a2f370a" containerName="mariadb-account-create-update" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.171384 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.173373 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.173749 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-7m7mk" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.175830 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rftb\" (UniqueName: \"kubernetes.io/projected/d8a426ca-a5fe-464a-951f-53ae1db79b1e-kube-api-access-9rftb\") pod \"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.175881 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-db-sync-config-data\") pod \"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.175917 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-config-data\") pod \"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.182883 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-hw5vn"] Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.277216 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-db-sync-config-data\") pod \"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.277451 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-config-data\") pod \"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.277616 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rftb\" (UniqueName: \"kubernetes.io/projected/d8a426ca-a5fe-464a-951f-53ae1db79b1e-kube-api-access-9rftb\") pod \"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.282824 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-config-data\") pod \"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.283202 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-db-sync-config-data\") pod \"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.299123 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rftb\" (UniqueName: \"kubernetes.io/projected/d8a426ca-a5fe-464a-951f-53ae1db79b1e-kube-api-access-9rftb\") pod 
\"glance-db-sync-hw5vn\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.533595 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:52 crc kubenswrapper[4761]: I1201 10:52:52.939757 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-hw5vn"] Dec 01 10:52:52 crc kubenswrapper[4761]: W1201 10:52:52.948858 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8a426ca_a5fe_464a_951f_53ae1db79b1e.slice/crio-b5340857df7dfb545d4f4024ea6a054c3e188675eb091823b0ab95da666a6644 WatchSource:0}: Error finding container b5340857df7dfb545d4f4024ea6a054c3e188675eb091823b0ab95da666a6644: Status 404 returned error can't find the container with id b5340857df7dfb545d4f4024ea6a054c3e188675eb091823b0ab95da666a6644 Dec 01 10:52:53 crc kubenswrapper[4761]: I1201 10:52:53.656625 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-hw5vn" event={"ID":"d8a426ca-a5fe-464a-951f-53ae1db79b1e","Type":"ContainerStarted","Data":"aebc17b337b0f856ab6f9c36fc6a95761ed0a40b7c965f962a03e914b4350778"} Dec 01 10:52:53 crc kubenswrapper[4761]: I1201 10:52:53.656962 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-hw5vn" event={"ID":"d8a426ca-a5fe-464a-951f-53ae1db79b1e","Type":"ContainerStarted","Data":"b5340857df7dfb545d4f4024ea6a054c3e188675eb091823b0ab95da666a6644"} Dec 01 10:52:53 crc kubenswrapper[4761]: I1201 10:52:53.679568 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-hw5vn" podStartSLOduration=1.679529321 podStartE2EDuration="1.679529321s" podCreationTimestamp="2025-12-01 10:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:53.674077653 +0000 UTC m=+1312.977836277" watchObservedRunningTime="2025-12-01 10:52:53.679529321 +0000 UTC m=+1312.983287955" Dec 01 10:52:56 crc kubenswrapper[4761]: I1201 10:52:56.696290 4761 generic.go:334] "Generic (PLEG): container finished" podID="d8a426ca-a5fe-464a-951f-53ae1db79b1e" containerID="aebc17b337b0f856ab6f9c36fc6a95761ed0a40b7c965f962a03e914b4350778" exitCode=0 Dec 01 10:52:56 crc kubenswrapper[4761]: I1201 10:52:56.696596 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-hw5vn" event={"ID":"d8a426ca-a5fe-464a-951f-53ae1db79b1e","Type":"ContainerDied","Data":"aebc17b337b0f856ab6f9c36fc6a95761ed0a40b7c965f962a03e914b4350778"} Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.067764 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.076065 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-db-sync-config-data\") pod \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.076171 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rftb\" (UniqueName: \"kubernetes.io/projected/d8a426ca-a5fe-464a-951f-53ae1db79b1e-kube-api-access-9rftb\") pod \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.076358 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-config-data\") pod 
\"d8a426ca-a5fe-464a-951f-53ae1db79b1e\" (UID: \"d8a426ca-a5fe-464a-951f-53ae1db79b1e\") " Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.082640 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d8a426ca-a5fe-464a-951f-53ae1db79b1e" (UID: "d8a426ca-a5fe-464a-951f-53ae1db79b1e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.082675 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a426ca-a5fe-464a-951f-53ae1db79b1e-kube-api-access-9rftb" (OuterVolumeSpecName: "kube-api-access-9rftb") pod "d8a426ca-a5fe-464a-951f-53ae1db79b1e" (UID: "d8a426ca-a5fe-464a-951f-53ae1db79b1e"). InnerVolumeSpecName "kube-api-access-9rftb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.130442 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-config-data" (OuterVolumeSpecName: "config-data") pod "d8a426ca-a5fe-464a-951f-53ae1db79b1e" (UID: "d8a426ca-a5fe-464a-951f-53ae1db79b1e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.178098 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.178142 4761 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8a426ca-a5fe-464a-951f-53ae1db79b1e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.178163 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rftb\" (UniqueName: \"kubernetes.io/projected/d8a426ca-a5fe-464a-951f-53ae1db79b1e-kube-api-access-9rftb\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.721958 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-hw5vn" event={"ID":"d8a426ca-a5fe-464a-951f-53ae1db79b1e","Type":"ContainerDied","Data":"b5340857df7dfb545d4f4024ea6a054c3e188675eb091823b0ab95da666a6644"} Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.722330 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5340857df7dfb545d4f4024ea6a054c3e188675eb091823b0ab95da666a6644" Dec 01 10:52:58 crc kubenswrapper[4761]: I1201 10:52:58.722050 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-hw5vn" Dec 01 10:52:59 crc kubenswrapper[4761]: I1201 10:52:59.882524 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:52:59 crc kubenswrapper[4761]: E1201 10:52:59.883468 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a426ca-a5fe-464a-951f-53ae1db79b1e" containerName="glance-db-sync" Dec 01 10:52:59 crc kubenswrapper[4761]: I1201 10:52:59.883622 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a426ca-a5fe-464a-951f-53ae1db79b1e" containerName="glance-db-sync" Dec 01 10:52:59 crc kubenswrapper[4761]: I1201 10:52:59.883822 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a426ca-a5fe-464a-951f-53ae1db79b1e" containerName="glance-db-sync" Dec 01 10:52:59 crc kubenswrapper[4761]: I1201 10:52:59.884748 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:52:59 crc kubenswrapper[4761]: I1201 10:52:59.886978 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-7m7mk" Dec 01 10:52:59 crc kubenswrapper[4761]: I1201 10:52:59.887357 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 01 10:52:59 crc kubenswrapper[4761]: I1201 10:52:59.888369 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Dec 01 10:52:59 crc kubenswrapper[4761]: I1201 10:52:59.896665 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.006755 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.006799 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-run\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.006833 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-dev\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.006853 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjll2\" (UniqueName: \"kubernetes.io/projected/ac516c9e-a80c-48c3-9a29-809f073fa66f-kube-api-access-mjll2\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.006877 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.006892 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-sys\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.007033 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.007119 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.007159 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.007309 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.007354 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.007396 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.007505 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.007542 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109432 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjll2\" (UniqueName: \"kubernetes.io/projected/ac516c9e-a80c-48c3-9a29-809f073fa66f-kube-api-access-mjll2\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109486 
4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109513 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-sys\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109542 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109583 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109600 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109649 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109665 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109684 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109713 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109744 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109780 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109799 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-run\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109828 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-dev\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109897 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-dev\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.109951 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-sys\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.110181 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: 
\"ac516c9e-a80c-48c3-9a29-809f073fa66f\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.110701 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.110830 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.110838 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-run\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.110874 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.110874 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.111066 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.111171 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.111165 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.116054 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.116084 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 
crc kubenswrapper[4761]: I1201 10:53:00.139129 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.141003 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.141359 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjll2\" (UniqueName: \"kubernetes.io/projected/ac516c9e-a80c-48c3-9a29-809f073fa66f-kube-api-access-mjll2\") pod \"glance-default-external-api-0\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.207236 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.241063 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.242523 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.244575 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.272012 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.314826 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.314911 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-run\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.314953 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-logs\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.314989 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315022 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b8xk\" (UniqueName: \"kubernetes.io/projected/6562f5f3-d175-4f16-90d8-00693b027f88-kube-api-access-6b8xk\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315182 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315257 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315325 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-sys\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315351 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-nvme\") pod 
\"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315387 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-dev\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315501 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315601 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315635 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.315667 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417130 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-logs\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417605 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417643 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8xk\" (UniqueName: \"kubernetes.io/projected/6562f5f3-d175-4f16-90d8-00693b027f88-kube-api-access-6b8xk\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417681 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417699 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417724 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-sys\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417737 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417755 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-dev\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417775 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417797 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-var-locks-brick\") 
pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417815 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417832 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417858 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417878 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-run\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417929 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-run\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.417542 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-logs\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.418132 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.420703 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-dev\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.421432 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.421496 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc 
kubenswrapper[4761]: I1201 10:53:00.421532 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-sys\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.421605 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.423735 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.423769 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.423847 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.431871 4761 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.432884 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.474371 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.480005 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.498330 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8xk\" (UniqueName: \"kubernetes.io/projected/6562f5f3-d175-4f16-90d8-00693b027f88-kube-api-access-6b8xk\") pod \"glance-default-internal-api-0\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.524909 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:53:00 crc kubenswrapper[4761]: W1201 10:53:00.537706 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac516c9e_a80c_48c3_9a29_809f073fa66f.slice/crio-50dbf3f56a4f7ae5f4579327bf42cb284b618549d1ffb009e81b5c598d41dcb8 WatchSource:0}: Error finding container 50dbf3f56a4f7ae5f4579327bf42cb284b618549d1ffb009e81b5c598d41dcb8: Status 404 returned error can't find the container with id 50dbf3f56a4f7ae5f4579327bf42cb284b618549d1ffb009e81b5c598d41dcb8 Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.624243 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.737997 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ac516c9e-a80c-48c3-9a29-809f073fa66f","Type":"ContainerStarted","Data":"53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89"} Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.738040 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ac516c9e-a80c-48c3-9a29-809f073fa66f","Type":"ContainerStarted","Data":"50dbf3f56a4f7ae5f4579327bf42cb284b618549d1ffb009e81b5c598d41dcb8"} Dec 01 10:53:00 crc kubenswrapper[4761]: I1201 10:53:00.834159 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.035112 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:01 crc kubenswrapper[4761]: W1201 10:53:01.035139 4761 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6562f5f3_d175_4f16_90d8_00693b027f88.slice/crio-92d2f7fe33b1a821dd28768b8567a99bb0f951f901e12add48299a615b1b35f4 WatchSource:0}: Error finding container 92d2f7fe33b1a821dd28768b8567a99bb0f951f901e12add48299a615b1b35f4: Status 404 returned error can't find the container with id 92d2f7fe33b1a821dd28768b8567a99bb0f951f901e12add48299a615b1b35f4 Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.754136 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ac516c9e-a80c-48c3-9a29-809f073fa66f","Type":"ContainerStarted","Data":"f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4"} Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.758246 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6562f5f3-d175-4f16-90d8-00693b027f88","Type":"ContainerStarted","Data":"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f"} Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.758418 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6562f5f3-d175-4f16-90d8-00693b027f88","Type":"ContainerStarted","Data":"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf"} Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.758590 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6562f5f3-d175-4f16-90d8-00693b027f88","Type":"ContainerStarted","Data":"92d2f7fe33b1a821dd28768b8567a99bb0f951f901e12add48299a615b1b35f4"} Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.758413 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" containerName="glance-httpd" 
containerID="cri-o://0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f" gracePeriod=30 Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.758349 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" containerName="glance-log" containerID="cri-o://7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf" gracePeriod=30 Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.783076 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.783056336 podStartE2EDuration="2.783056336s" podCreationTimestamp="2025-12-01 10:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:53:01.774765439 +0000 UTC m=+1321.078524103" watchObservedRunningTime="2025-12-01 10:53:01.783056336 +0000 UTC m=+1321.086814970" Dec 01 10:53:01 crc kubenswrapper[4761]: I1201 10:53:01.809735 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.809706553 podStartE2EDuration="2.809706553s" podCreationTimestamp="2025-12-01 10:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:53:01.798069806 +0000 UTC m=+1321.101828470" watchObservedRunningTime="2025-12-01 10:53:01.809706553 +0000 UTC m=+1321.113465217" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.194424 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244015 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-var-locks-brick\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244304 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244336 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-nvme\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244390 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244426 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-run\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244454 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244465 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-lib-modules\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244482 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-run" (OuterVolumeSpecName: "run") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244525 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-dev\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244582 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244626 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-scripts\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244676 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-dev" (OuterVolumeSpecName: "dev") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244688 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b8xk\" (UniqueName: \"kubernetes.io/projected/6562f5f3-d175-4f16-90d8-00693b027f88-kube-api-access-6b8xk\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244714 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-iscsi\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244737 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-httpd-run\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244759 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-logs\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244777 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-sys\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244805 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.244841 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-config-data\") pod \"6562f5f3-d175-4f16-90d8-00693b027f88\" (UID: \"6562f5f3-d175-4f16-90d8-00693b027f88\") " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.245038 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-sys" (OuterVolumeSpecName: "sys") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.245172 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.245498 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-logs" (OuterVolumeSpecName: "logs") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.245516 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.246182 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.246212 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.246253 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.246272 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.246287 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.246329 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-etc-iscsi\") on node 
\"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.246343 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6562f5f3-d175-4f16-90d8-00693b027f88-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.249743 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.254874 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-scripts" (OuterVolumeSpecName: "scripts") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.255740 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6562f5f3-d175-4f16-90d8-00693b027f88-kube-api-access-6b8xk" (OuterVolumeSpecName: "kube-api-access-6b8xk") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "kube-api-access-6b8xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.256414 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.297299 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-config-data" (OuterVolumeSpecName: "config-data") pod "6562f5f3-d175-4f16-90d8-00693b027f88" (UID: "6562f5f3-d175-4f16-90d8-00693b027f88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.347437 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.347504 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.347519 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6562f5f3-d175-4f16-90d8-00693b027f88-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.347532 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b8xk\" (UniqueName: \"kubernetes.io/projected/6562f5f3-d175-4f16-90d8-00693b027f88-kube-api-access-6b8xk\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.347568 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.347579 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6562f5f3-d175-4f16-90d8-00693b027f88-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.347598 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.360337 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.365482 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.448175 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.448344 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.771358 4761 generic.go:334] "Generic (PLEG): container finished" podID="6562f5f3-d175-4f16-90d8-00693b027f88" containerID="0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f" exitCode=143 Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.771403 4761 generic.go:334] "Generic (PLEG): container finished" podID="6562f5f3-d175-4f16-90d8-00693b027f88" containerID="7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf" exitCode=143 Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.772416 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.776458 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6562f5f3-d175-4f16-90d8-00693b027f88","Type":"ContainerDied","Data":"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f"} Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.776575 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6562f5f3-d175-4f16-90d8-00693b027f88","Type":"ContainerDied","Data":"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf"} Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.776616 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"6562f5f3-d175-4f16-90d8-00693b027f88","Type":"ContainerDied","Data":"92d2f7fe33b1a821dd28768b8567a99bb0f951f901e12add48299a615b1b35f4"} Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.777071 4761 scope.go:117] "RemoveContainer" containerID="0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.815800 4761 scope.go:117] "RemoveContainer" containerID="7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.823123 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.831583 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.850049 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:02 crc kubenswrapper[4761]: E1201 10:53:02.850344 4761 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" containerName="glance-log" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.850362 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" containerName="glance-log" Dec 01 10:53:02 crc kubenswrapper[4761]: E1201 10:53:02.850377 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" containerName="glance-httpd" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.850388 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" containerName="glance-httpd" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.850638 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" containerName="glance-log" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.850662 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" containerName="glance-httpd" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.852381 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.855114 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.858961 4761 scope.go:117] "RemoveContainer" containerID="0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f" Dec 01 10:53:02 crc kubenswrapper[4761]: E1201 10:53:02.859674 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f\": container with ID starting with 0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f not found: ID does not exist" containerID="0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.859956 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f"} err="failed to get container status \"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f\": rpc error: code = NotFound desc = could not find container \"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f\": container with ID starting with 0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f not found: ID does not exist" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.860013 4761 scope.go:117] "RemoveContainer" containerID="7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf" Dec 01 10:53:02 crc kubenswrapper[4761]: E1201 10:53:02.860622 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf\": container with ID starting with 
7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf not found: ID does not exist" containerID="7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.860662 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf"} err="failed to get container status \"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf\": rpc error: code = NotFound desc = could not find container \"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf\": container with ID starting with 7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf not found: ID does not exist" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.860687 4761 scope.go:117] "RemoveContainer" containerID="0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.863220 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f"} err="failed to get container status \"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f\": rpc error: code = NotFound desc = could not find container \"0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f\": container with ID starting with 0511da2c2f4fc993d670391f1b4bd9d992384e0fbc7ced319530a6ecf1932e0f not found: ID does not exist" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.863246 4761 scope.go:117] "RemoveContainer" containerID="7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.863535 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf"} err="failed to get container status 
\"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf\": rpc error: code = NotFound desc = could not find container \"7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf\": container with ID starting with 7da86dfb253687a4b66804ba2d6d1f618defd6600982564f8622eb90a5fbccbf not found: ID does not exist" Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.869781 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:02 crc kubenswrapper[4761]: I1201 10:53:02.957369 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.059169 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.059933 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.059998 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-iscsi\") pod 
\"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060044 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060077 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060175 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-sys\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060278 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060329 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2xzh\" (UniqueName: 
\"kubernetes.io/projected/3dbfc272-45cc-42df-ba35-1f70031b0a86-kube-api-access-n2xzh\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060388 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060498 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060584 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-dev\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060734 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060799 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-logs\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060865 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-run\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.060994 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.105947 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.144541 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6562f5f3-d175-4f16-90d8-00693b027f88" path="/var/lib/kubelet/pods/6562f5f3-d175-4f16-90d8-00693b027f88/volumes" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163351 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163463 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163532 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163627 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163643 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163674 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163687 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163673 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163733 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-sys\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163763 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-sys\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163801 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: 
I1201 10:53:03.163739 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163883 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2xzh\" (UniqueName: \"kubernetes.io/projected/3dbfc272-45cc-42df-ba35-1f70031b0a86-kube-api-access-n2xzh\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.163981 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.164007 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.164134 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: 
I1201 10:53:03.164198 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-dev\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.164323 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-logs\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.164359 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-run\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.164515 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-run\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.164580 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-dev\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.164534 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.165041 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-logs\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.169013 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.170194 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.189924 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.196050 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2xzh\" (UniqueName: \"kubernetes.io/projected/3dbfc272-45cc-42df-ba35-1f70031b0a86-kube-api-access-n2xzh\") pod 
\"glance-default-internal-api-0\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.478200 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:03 crc kubenswrapper[4761]: I1201 10:53:03.794425 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:03 crc kubenswrapper[4761]: W1201 10:53:03.805206 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dbfc272_45cc_42df_ba35_1f70031b0a86.slice/crio-e289ab6e67e83de2665fb506ce28d36ed48e3c3563856fca2bd54c34cdb40a9d WatchSource:0}: Error finding container e289ab6e67e83de2665fb506ce28d36ed48e3c3563856fca2bd54c34cdb40a9d: Status 404 returned error can't find the container with id e289ab6e67e83de2665fb506ce28d36ed48e3c3563856fca2bd54c34cdb40a9d Dec 01 10:53:04 crc kubenswrapper[4761]: I1201 10:53:04.803271 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3dbfc272-45cc-42df-ba35-1f70031b0a86","Type":"ContainerStarted","Data":"aabd6f20ef43721cbbebdb4fcc852bc4ca6397526caf4e867912a0f157ad95cf"} Dec 01 10:53:04 crc kubenswrapper[4761]: I1201 10:53:04.803791 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3dbfc272-45cc-42df-ba35-1f70031b0a86","Type":"ContainerStarted","Data":"2a5e06663ee29a958a43e26fffc05dfd35f848e3452429ad8ceff19b664e5861"} Dec 01 10:53:04 crc kubenswrapper[4761]: I1201 10:53:04.803829 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"3dbfc272-45cc-42df-ba35-1f70031b0a86","Type":"ContainerStarted","Data":"e289ab6e67e83de2665fb506ce28d36ed48e3c3563856fca2bd54c34cdb40a9d"} Dec 01 10:53:04 crc kubenswrapper[4761]: I1201 10:53:04.836026 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.835998683 podStartE2EDuration="2.835998683s" podCreationTimestamp="2025-12-01 10:53:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:53:04.825667911 +0000 UTC m=+1324.129426575" watchObservedRunningTime="2025-12-01 10:53:04.835998683 +0000 UTC m=+1324.139757337" Dec 01 10:53:10 crc kubenswrapper[4761]: I1201 10:53:10.208211 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:10 crc kubenswrapper[4761]: I1201 10:53:10.208851 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:10 crc kubenswrapper[4761]: I1201 10:53:10.244510 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:10 crc kubenswrapper[4761]: I1201 10:53:10.266146 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:10 crc kubenswrapper[4761]: I1201 10:53:10.866618 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:10 crc kubenswrapper[4761]: I1201 10:53:10.866875 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:12 crc kubenswrapper[4761]: I1201 10:53:12.785959 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:12 crc kubenswrapper[4761]: I1201 10:53:12.787786 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:13 crc kubenswrapper[4761]: I1201 10:53:13.478640 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:13 crc kubenswrapper[4761]: I1201 10:53:13.478712 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:13 crc kubenswrapper[4761]: I1201 10:53:13.522501 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:13 crc kubenswrapper[4761]: I1201 10:53:13.561757 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:13 crc kubenswrapper[4761]: I1201 10:53:13.912369 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:13 crc kubenswrapper[4761]: I1201 10:53:13.912423 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:15 crc kubenswrapper[4761]: I1201 10:53:15.742248 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:15 crc kubenswrapper[4761]: I1201 10:53:15.789169 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.223177 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:53:18 crc 
kubenswrapper[4761]: I1201 10:53:18.225173 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.235104 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.237137 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.260845 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.277470 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.347113 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.349871 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.356327 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.357766 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.385166 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.392198 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.407834 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.407906 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.407945 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408014 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: 
\"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408064 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408095 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-scripts\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408125 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-dev\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408157 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-run\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408195 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9spr\" (UniqueName: \"kubernetes.io/projected/cea9adc0-9cd1-4b76-b738-a43491864db2-kube-api-access-l9spr\") pod 
\"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408273 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djd97\" (UniqueName: \"kubernetes.io/projected/49872107-a7f9-41a8-8277-a50c1a74d521-kube-api-access-djd97\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408315 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-logs\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408343 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-sys\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408376 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-config-data\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408422 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-run\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408558 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408582 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-scripts\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408616 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-sys\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408640 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408660 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408686 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408708 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408734 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408756 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-config-data\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408775 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-logs\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408795 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408816 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-dev\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408838 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.408861 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510519 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510587 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-sys\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510618 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510637 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510658 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-config-data\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc 
kubenswrapper[4761]: I1201 10:53:18.510685 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510709 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gd49\" (UniqueName: \"kubernetes.io/projected/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-kube-api-access-7gd49\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510730 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510754 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510779 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510806 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510827 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-scripts\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510845 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-dev\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510868 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-run\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510891 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9spr\" (UniqueName: \"kubernetes.io/projected/cea9adc0-9cd1-4b76-b738-a43491864db2-kube-api-access-l9spr\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 
crc kubenswrapper[4761]: I1201 10:53:18.510911 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510934 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510957 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djd97\" (UniqueName: \"kubernetes.io/projected/49872107-a7f9-41a8-8277-a50c1a74d521-kube-api-access-djd97\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510977 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-scripts\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.510998 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-dev\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511018 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-dev\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511039 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-logs\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511058 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-sys\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511079 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-config-data\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511100 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-run\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc 
kubenswrapper[4761]: I1201 10:53:18.511117 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-sys\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511136 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qkr2\" (UniqueName: \"kubernetes.io/projected/9a651dda-a109-4fb3-850a-5f8a7f210d8d-kube-api-access-8qkr2\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511157 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-logs\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511179 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511201 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" 
Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511226 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-scripts\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511252 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-sys\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511277 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511303 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511326 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 
10:53:18.511347 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511372 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-logs\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511401 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511420 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511442 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 
10:53:18.511463 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511486 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-config-data\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511540 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-logs\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511598 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511648 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-dev\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511668 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511691 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511713 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511736 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511766 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511811 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511838 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511873 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511894 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511924 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-run\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.511950 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-run\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.512013 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.512647 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-run\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.512760 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.512760 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-dev\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513062 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-sys\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513057 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513161 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513212 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513526 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513584 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-dev\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513633 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513634 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513663 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-run\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513676 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513689 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-sys\") pod \"glance-default-external-api-1\" (UID: 
\"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513764 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513856 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513870 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.513954 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-logs\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.514000 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: 
\"cea9adc0-9cd1-4b76-b738-a43491864db2\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.514227 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-logs\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.519453 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.519906 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-scripts\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.524843 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-config-data\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.526290 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-scripts\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.528323 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-config-data\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.546799 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djd97\" (UniqueName: \"kubernetes.io/projected/49872107-a7f9-41a8-8277-a50c1a74d521-kube-api-access-djd97\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.548231 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.560930 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9spr\" (UniqueName: \"kubernetes.io/projected/cea9adc0-9cd1-4b76-b738-a43491864db2-kube-api-access-l9spr\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.566223 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 
10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.573179 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.576928 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.578161 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-2\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.595023 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614076 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614477 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614523 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-logs\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614602 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614651 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614686 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614746 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614777 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614803 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-run\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614842 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-run\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614878 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-sys\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614906 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614943 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614973 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615005 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-config-data\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 
01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615040 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gd49\" (UniqueName: \"kubernetes.io/projected/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-kube-api-access-7gd49\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615072 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615106 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615137 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615185 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 
10:53:18.615221 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615228 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-run\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615259 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-scripts\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615306 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-dev\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615339 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-dev\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615349 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-sys\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615381 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-sys\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615418 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qkr2\" (UniqueName: \"kubernetes.io/projected/9a651dda-a109-4fb3-850a-5f8a7f210d8d-kube-api-access-8qkr2\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615437 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-run\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615452 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615481 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-logs\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615750 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615813 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.614271 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615841 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615878 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-dev\") pod 
\"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615237 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616010 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-dev\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616024 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615858 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-sys\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.615904 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") device mount path 
\"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616089 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616156 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616337 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-logs\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616408 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616477 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616538 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.616920 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-logs\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.617272 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.621235 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.621268 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.624468 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.627120 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-config-data\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.634985 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-scripts\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.642270 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qkr2\" (UniqueName: \"kubernetes.io/projected/9a651dda-a109-4fb3-850a-5f8a7f210d8d-kube-api-access-8qkr2\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.653813 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc 
kubenswrapper[4761]: I1201 10:53:18.655708 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.656992 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.661909 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gd49\" (UniqueName: \"kubernetes.io/projected/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-kube-api-access-7gd49\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.682773 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-2\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.686027 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:18 crc kubenswrapper[4761]: I1201 10:53:18.973776 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.091758 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.098326 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:53:19 crc kubenswrapper[4761]: W1201 10:53:19.122919 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49872107_a7f9_41a8_8277_a50c1a74d521.slice/crio-f7d4d9aa48b97c1024ce1894e7788651335014040b7ea28dafee9482b6a6bfeb WatchSource:0}: Error finding container f7d4d9aa48b97c1024ce1894e7788651335014040b7ea28dafee9482b6a6bfeb: Status 404 returned error can't find the container with id f7d4d9aa48b97c1024ce1894e7788651335014040b7ea28dafee9482b6a6bfeb Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.174335 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:53:19 crc kubenswrapper[4761]: W1201 10:53:19.199994 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a651dda_a109_4fb3_850a_5f8a7f210d8d.slice/crio-105cf848ebfd0c70fc2bc5f5973a8d40e8258786092baab00d251c5a30191fc0 WatchSource:0}: Error finding container 105cf848ebfd0c70fc2bc5f5973a8d40e8258786092baab00d251c5a30191fc0: Status 404 returned error can't find the container with id 105cf848ebfd0c70fc2bc5f5973a8d40e8258786092baab00d251c5a30191fc0 Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.446182 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.971449 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"9a651dda-a109-4fb3-850a-5f8a7f210d8d","Type":"ContainerStarted","Data":"80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.972065 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"9a651dda-a109-4fb3-850a-5f8a7f210d8d","Type":"ContainerStarted","Data":"2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.972080 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"9a651dda-a109-4fb3-850a-5f8a7f210d8d","Type":"ContainerStarted","Data":"105cf848ebfd0c70fc2bc5f5973a8d40e8258786092baab00d251c5a30191fc0"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.976148 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"cea9adc0-9cd1-4b76-b738-a43491864db2","Type":"ContainerStarted","Data":"b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.976214 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"cea9adc0-9cd1-4b76-b738-a43491864db2","Type":"ContainerStarted","Data":"3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.976235 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"cea9adc0-9cd1-4b76-b738-a43491864db2","Type":"ContainerStarted","Data":"0a223ff342a51f77fccc1c510f7271a06ba099dbb8efb31e6a2b048a3c278cd3"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.978796 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0","Type":"ContainerStarted","Data":"8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.978834 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0","Type":"ContainerStarted","Data":"e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.978843 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0","Type":"ContainerStarted","Data":"cda12bedc902a8955b89c33fe79fa774d33884e92c6fb479c72e8f727c952a4d"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.981123 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"49872107-a7f9-41a8-8277-a50c1a74d521","Type":"ContainerStarted","Data":"66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.981150 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"49872107-a7f9-41a8-8277-a50c1a74d521","Type":"ContainerStarted","Data":"de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737"} Dec 01 10:53:19 crc kubenswrapper[4761]: I1201 10:53:19.981161 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"49872107-a7f9-41a8-8277-a50c1a74d521","Type":"ContainerStarted","Data":"f7d4d9aa48b97c1024ce1894e7788651335014040b7ea28dafee9482b6a6bfeb"} Dec 01 10:53:20 crc kubenswrapper[4761]: I1201 10:53:20.017861 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.017841318 podStartE2EDuration="3.017841318s" podCreationTimestamp="2025-12-01 10:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:53:20.001586014 +0000 UTC m=+1339.305344638" watchObservedRunningTime="2025-12-01 10:53:20.017841318 +0000 UTC m=+1339.321599942" Dec 01 10:53:20 crc kubenswrapper[4761]: I1201 10:53:20.047387 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.047370074 podStartE2EDuration="3.047370074s" podCreationTimestamp="2025-12-01 10:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:53:20.033005622 +0000 UTC m=+1339.336764256" watchObservedRunningTime="2025-12-01 10:53:20.047370074 +0000 UTC m=+1339.351128688" Dec 01 10:53:20 crc kubenswrapper[4761]: I1201 10:53:20.068419 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.068399658 podStartE2EDuration="3.068399658s" podCreationTimestamp="2025-12-01 10:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:53:20.063983898 +0000 UTC m=+1339.367742522" watchObservedRunningTime="2025-12-01 10:53:20.068399658 +0000 UTC m=+1339.372158292" Dec 01 10:53:20 crc kubenswrapper[4761]: I1201 10:53:20.092024 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.092005543 podStartE2EDuration="3.092005543s" podCreationTimestamp="2025-12-01 10:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:53:20.083051328 +0000 UTC m=+1339.386809952" watchObservedRunningTime="2025-12-01 10:53:20.092005543 +0000 UTC m=+1339.395764177" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.578609 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.580223 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.596291 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.596342 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.614675 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.638902 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.643424 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.652277 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.686205 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.686254 4761 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.713439 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.721805 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.976859 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:28 crc kubenswrapper[4761]: I1201 10:53:28.976936 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.017755 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.044721 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.079841 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.080148 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.080163 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.080174 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.080471 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.080716 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.082747 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:29 crc kubenswrapper[4761]: I1201 10:53:29.082796 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:30 crc kubenswrapper[4761]: I1201 10:53:30.897389 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:30 crc kubenswrapper[4761]: I1201 10:53:30.941507 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:30 crc kubenswrapper[4761]: I1201 10:53:30.973965 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:30 crc kubenswrapper[4761]: I1201 10:53:30.982632 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:31 crc kubenswrapper[4761]: I1201 10:53:31.073942 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:31 crc kubenswrapper[4761]: I1201 10:53:31.086013 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:53:31 crc kubenswrapper[4761]: I1201 10:53:31.086075 4761 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Dec 01 10:53:31 crc kubenswrapper[4761]: I1201 10:53:31.086096 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:53:31 crc kubenswrapper[4761]: I1201 10:53:31.101993 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:31 crc kubenswrapper[4761]: I1201 10:53:31.114700 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:31 crc kubenswrapper[4761]: I1201 10:53:31.119064 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:31 crc kubenswrapper[4761]: I1201 10:53:31.992279 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:53:32 crc kubenswrapper[4761]: I1201 10:53:32.002767 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:53:32 crc kubenswrapper[4761]: I1201 10:53:32.173767 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:53:32 crc kubenswrapper[4761]: I1201 10:53:32.192485 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:53:33 crc kubenswrapper[4761]: I1201 10:53:33.101784 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerName="glance-log" containerID="cri-o://3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035" gracePeriod=30 Dec 01 10:53:33 crc kubenswrapper[4761]: I1201 10:53:33.101980 4761 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-external-api-1" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerName="glance-httpd" containerID="cri-o://b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5" gracePeriod=30 Dec 01 10:53:33 crc kubenswrapper[4761]: I1201 10:53:33.102367 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerName="glance-log" containerID="cri-o://2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab" gracePeriod=30 Dec 01 10:53:33 crc kubenswrapper[4761]: I1201 10:53:33.102602 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerName="glance-httpd" containerID="cri-o://80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d" gracePeriod=30 Dec 01 10:53:33 crc kubenswrapper[4761]: I1201 10:53:33.103985 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerName="glance-log" containerID="cri-o://e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb" gracePeriod=30 Dec 01 10:53:33 crc kubenswrapper[4761]: I1201 10:53:33.104164 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerName="glance-httpd" containerID="cri-o://8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396" gracePeriod=30 Dec 01 10:53:33 crc kubenswrapper[4761]: I1201 10:53:33.104702 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="49872107-a7f9-41a8-8277-a50c1a74d521" containerName="glance-httpd" 
containerID="cri-o://66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63" gracePeriod=30 Dec 01 10:53:33 crc kubenswrapper[4761]: I1201 10:53:33.104819 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="49872107-a7f9-41a8-8277-a50c1a74d521" containerName="glance-log" containerID="cri-o://de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737" gracePeriod=30 Dec 01 10:53:34 crc kubenswrapper[4761]: I1201 10:53:34.112485 4761 generic.go:334] "Generic (PLEG): container finished" podID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerID="3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035" exitCode=143 Dec 01 10:53:34 crc kubenswrapper[4761]: I1201 10:53:34.112780 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"cea9adc0-9cd1-4b76-b738-a43491864db2","Type":"ContainerDied","Data":"3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035"} Dec 01 10:53:34 crc kubenswrapper[4761]: I1201 10:53:34.115471 4761 generic.go:334] "Generic (PLEG): container finished" podID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerID="e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb" exitCode=143 Dec 01 10:53:34 crc kubenswrapper[4761]: I1201 10:53:34.115560 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0","Type":"ContainerDied","Data":"e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb"} Dec 01 10:53:34 crc kubenswrapper[4761]: I1201 10:53:34.117617 4761 generic.go:334] "Generic (PLEG): container finished" podID="49872107-a7f9-41a8-8277-a50c1a74d521" containerID="de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737" exitCode=143 Dec 01 10:53:34 crc kubenswrapper[4761]: I1201 10:53:34.117671 4761 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"49872107-a7f9-41a8-8277-a50c1a74d521","Type":"ContainerDied","Data":"de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737"} Dec 01 10:53:34 crc kubenswrapper[4761]: I1201 10:53:34.119724 4761 generic.go:334] "Generic (PLEG): container finished" podID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerID="2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab" exitCode=143 Dec 01 10:53:34 crc kubenswrapper[4761]: I1201 10:53:34.119752 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"9a651dda-a109-4fb3-850a-5f8a7f210d8d","Type":"ContainerDied","Data":"2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab"} Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.718034 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.726078 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.732771 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.761273 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.830999 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-var-locks-brick\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831073 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-logs\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831099 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-iscsi\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831123 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9spr\" (UniqueName: \"kubernetes.io/projected/cea9adc0-9cd1-4b76-b738-a43491864db2-kube-api-access-l9spr\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831145 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-var-locks-brick\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831166 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-scripts\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831181 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-dev\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831198 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-logs\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831222 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-config-data\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831239 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-sys\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831264 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djd97\" (UniqueName: \"kubernetes.io/projected/49872107-a7f9-41a8-8277-a50c1a74d521-kube-api-access-djd97\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831280 4761 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-httpd-run\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831297 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-run\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831319 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-iscsi\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831337 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-lib-modules\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831356 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831375 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-run\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831392 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-config-data\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831409 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-sys\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831424 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-dev\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831441 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-iscsi\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831454 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831479 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-config-data\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 
10:53:36.831492 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-dev\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831509 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-sys\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831523 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-logs\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831566 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-nvme\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831579 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831592 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-var-locks-brick\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc 
kubenswrapper[4761]: I1201 10:53:36.831611 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-httpd-run\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831631 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831646 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831676 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-run\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831697 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-scripts\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831714 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-httpd-run\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc 
kubenswrapper[4761]: I1201 10:53:36.831729 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-lib-modules\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831743 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831756 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-lib-modules\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831776 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-scripts\") pod \"cea9adc0-9cd1-4b76-b738-a43491864db2\" (UID: \"cea9adc0-9cd1-4b76-b738-a43491864db2\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831792 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-nvme\") pod \"49872107-a7f9-41a8-8277-a50c1a74d521\" (UID: \"49872107-a7f9-41a8-8277-a50c1a74d521\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831808 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-nvme\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: 
\"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.831830 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qkr2\" (UniqueName: \"kubernetes.io/projected/9a651dda-a109-4fb3-850a-5f8a7f210d8d-kube-api-access-8qkr2\") pod \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\" (UID: \"9a651dda-a109-4fb3-850a-5f8a7f210d8d\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832140 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-dev" (OuterVolumeSpecName: "dev") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832189 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832223 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-sys" (OuterVolumeSpecName: "sys") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832227 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-sys" (OuterVolumeSpecName: "sys") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). 
InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832247 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-dev" (OuterVolumeSpecName: "dev") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832255 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832568 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-logs" (OuterVolumeSpecName: "logs") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832568 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832604 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832627 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-run" (OuterVolumeSpecName: "run") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.832644 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-sys" (OuterVolumeSpecName: "sys") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.833173 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.833223 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.834451 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.834479 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-logs" (OuterVolumeSpecName: "logs") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.834692 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.834739 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.837580 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.837635 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-dev" (OuterVolumeSpecName: "dev") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.837939 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-logs" (OuterVolumeSpecName: "logs") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.837971 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-run" (OuterVolumeSpecName: "run") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.838006 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.838042 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49872107-a7f9-41a8-8277-a50c1a74d521-kube-api-access-djd97" (OuterVolumeSpecName: "kube-api-access-djd97") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "kube-api-access-djd97". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.838084 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.838301 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.838328 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-run" (OuterVolumeSpecName: "run") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.838347 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.838760 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.840404 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.840431 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.840524 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.840659 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a651dda-a109-4fb3-850a-5f8a7f210d8d-kube-api-access-8qkr2" (OuterVolumeSpecName: "kube-api-access-8qkr2") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "kube-api-access-8qkr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.840903 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-scripts" (OuterVolumeSpecName: "scripts") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.841078 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-scripts" (OuterVolumeSpecName: "scripts") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.841265 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.841919 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea9adc0-9cd1-4b76-b738-a43491864db2-kube-api-access-l9spr" (OuterVolumeSpecName: "kube-api-access-l9spr") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "kube-api-access-l9spr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.842030 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-scripts" (OuterVolumeSpecName: "scripts") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.842385 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.843718 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.867694 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-config-data" (OuterVolumeSpecName: "config-data") pod "9a651dda-a109-4fb3-850a-5f8a7f210d8d" (UID: "9a651dda-a109-4fb3-850a-5f8a7f210d8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.871025 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-config-data" (OuterVolumeSpecName: "config-data") pod "49872107-a7f9-41a8-8277-a50c1a74d521" (UID: "49872107-a7f9-41a8-8277-a50c1a74d521"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.876957 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-config-data" (OuterVolumeSpecName: "config-data") pod "cea9adc0-9cd1-4b76-b738-a43491864db2" (UID: "cea9adc0-9cd1-4b76-b738-a43491864db2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932613 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-nvme\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932676 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932721 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-iscsi\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932741 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gd49\" (UniqueName: \"kubernetes.io/projected/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-kube-api-access-7gd49\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932756 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-dev\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932795 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-var-locks-brick\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932835 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-config-data\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932857 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-httpd-run\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932877 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-sys\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: 
\"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932919 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-scripts\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932938 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-run\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932954 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932967 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-lib-modules\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.932981 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-logs\") pod \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\" (UID: \"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0\") " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.933778 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc 
kubenswrapper[4761]: I1201 10:53:36.933792 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9spr\" (UniqueName: \"kubernetes.io/projected/cea9adc0-9cd1-4b76-b738-a43491864db2-kube-api-access-l9spr\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.933782 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.933822 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-dev" (OuterVolumeSpecName: "dev") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.933784 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.934250 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-logs" (OuterVolumeSpecName: "logs") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.934436 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-run" (OuterVolumeSpecName: "run") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.934472 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-sys" (OuterVolumeSpecName: "sys") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.934674 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.933803 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936424 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936448 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936351 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936459 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936401 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936434 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936471 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936518 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936576 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djd97\" (UniqueName: \"kubernetes.io/projected/49872107-a7f9-41a8-8277-a50c1a74d521-kube-api-access-djd97\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936589 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936602 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936614 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936625 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936703 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936754 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936766 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936778 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936789 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936799 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936815 4761 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936825 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49872107-a7f9-41a8-8277-a50c1a74d521-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936835 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936862 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936873 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936883 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cea9adc0-9cd1-4b76-b738-a43491864db2-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936897 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936908 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936920 
4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a651dda-a109-4fb3-850a-5f8a7f210d8d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936935 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936950 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936961 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936972 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a651dda-a109-4fb3-850a-5f8a7f210d8d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936983 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49872107-a7f9-41a8-8277-a50c1a74d521-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.936998 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.937013 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 01 10:53:36 crc 
kubenswrapper[4761]: I1201 10:53:36.937025 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.937036 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea9adc0-9cd1-4b76-b738-a43491864db2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.937047 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.937059 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a651dda-a109-4fb3-850a-5f8a7f210d8d-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.937073 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qkr2\" (UniqueName: \"kubernetes.io/projected/9a651dda-a109-4fb3-850a-5f8a7f210d8d-kube-api-access-8qkr2\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.937085 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49872107-a7f9-41a8-8277-a50c1a74d521-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.937096 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea9adc0-9cd1-4b76-b738-a43491864db2-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.938377 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-scripts" (OuterVolumeSpecName: "scripts") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.938885 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.946617 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-kube-api-access-7gd49" (OuterVolumeSpecName: "kube-api-access-7gd49") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "kube-api-access-7gd49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.952856 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.955932 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.956114 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.957810 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.965009 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.965563 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:53:36 crc kubenswrapper[4761]: I1201 10:53:36.987466 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-config-data" (OuterVolumeSpecName: "config-data") pod "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" (UID: "2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038765 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038803 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038815 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038825 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038835 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038847 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gd49\" (UniqueName: \"kubernetes.io/projected/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-kube-api-access-7gd49\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038858 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038867 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038876 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038887 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038896 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038905 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038913 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038922 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038937 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038947 4761 reconciler_common.go:293] "Volume detached for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038958 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038968 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038977 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.038987 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.058441 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.064647 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.146392 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.146421 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.182564 4761 generic.go:334] "Generic (PLEG): container finished" podID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerID="8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396" exitCode=0 Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.182634 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.182634 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0","Type":"ContainerDied","Data":"8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396"} Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.182782 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0","Type":"ContainerDied","Data":"cda12bedc902a8955b89c33fe79fa774d33884e92c6fb479c72e8f727c952a4d"} Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.182820 4761 scope.go:117] "RemoveContainer" containerID="8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.186271 4761 generic.go:334] "Generic (PLEG): container finished" podID="49872107-a7f9-41a8-8277-a50c1a74d521" containerID="66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63" exitCode=0 Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.186351 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"49872107-a7f9-41a8-8277-a50c1a74d521","Type":"ContainerDied","Data":"66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63"} Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 
10:53:37.186382 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"49872107-a7f9-41a8-8277-a50c1a74d521","Type":"ContainerDied","Data":"f7d4d9aa48b97c1024ce1894e7788651335014040b7ea28dafee9482b6a6bfeb"} Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.186407 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.189453 4761 generic.go:334] "Generic (PLEG): container finished" podID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerID="80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d" exitCode=0 Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.189501 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"9a651dda-a109-4fb3-850a-5f8a7f210d8d","Type":"ContainerDied","Data":"80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d"} Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.189523 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"9a651dda-a109-4fb3-850a-5f8a7f210d8d","Type":"ContainerDied","Data":"105cf848ebfd0c70fc2bc5f5973a8d40e8258786092baab00d251c5a30191fc0"} Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.189599 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.192928 4761 generic.go:334] "Generic (PLEG): container finished" podID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerID="b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5" exitCode=0 Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.192955 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"cea9adc0-9cd1-4b76-b738-a43491864db2","Type":"ContainerDied","Data":"b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5"} Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.192972 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"cea9adc0-9cd1-4b76-b738-a43491864db2","Type":"ContainerDied","Data":"0a223ff342a51f77fccc1c510f7271a06ba099dbb8efb31e6a2b048a3c278cd3"} Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.193024 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.213814 4761 scope.go:117] "RemoveContainer" containerID="e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.219354 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.224265 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.233610 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.241188 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.246083 4761 scope.go:117] "RemoveContainer" containerID="8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.247194 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:53:37 crc kubenswrapper[4761]: E1201 10:53:37.247985 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396\": container with ID starting with 8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396 not found: ID does not exist" containerID="8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.248098 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396"} err="failed to get container status \"8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396\": rpc error: code = NotFound desc = could not find container \"8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396\": container with ID starting with 8e03f17dfcebfd06e7d3f7b5fb9be5aa284cb07a8c9bece5bb80f9b7b200b396 not found: ID does not exist" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.248176 4761 scope.go:117] "RemoveContainer" containerID="e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb" Dec 01 10:53:37 crc kubenswrapper[4761]: E1201 10:53:37.248914 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb\": container with ID starting with e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb not found: ID does not exist" containerID="e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.248935 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb"} err="failed to get container status \"e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb\": rpc error: code = NotFound desc = could not find container \"e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb\": container with ID starting with e749c76a7c1e34706ffddb4cb4bd400c874421facf17c0abf43ae46e2ce1d8bb not found: ID does not exist" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.248949 4761 scope.go:117] "RemoveContainer" containerID="66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.255849 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.260836 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.265192 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.268658 4761 scope.go:117] "RemoveContainer" containerID="de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.286082 4761 scope.go:117] "RemoveContainer" containerID="66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63" Dec 01 10:53:37 crc kubenswrapper[4761]: E1201 10:53:37.286515 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63\": container with ID starting with 66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63 not found: ID does not exist" containerID="66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.286541 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63"} err="failed to get container status \"66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63\": rpc error: code = NotFound desc = could not find container \"66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63\": container with ID starting with 66b3b80f64d0078fb5a2d0b6ed7afb175dd2f1da3978f1eb248605d400f36e63 not found: ID does not exist" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.286576 4761 scope.go:117] "RemoveContainer" 
containerID="de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737" Dec 01 10:53:37 crc kubenswrapper[4761]: E1201 10:53:37.286874 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737\": container with ID starting with de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737 not found: ID does not exist" containerID="de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.286896 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737"} err="failed to get container status \"de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737\": rpc error: code = NotFound desc = could not find container \"de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737\": container with ID starting with de6390ba48eeb4a82a6a4013c23e1f1cd778e0c9307e529f8b0b237749343737 not found: ID does not exist" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.286907 4761 scope.go:117] "RemoveContainer" containerID="80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.304902 4761 scope.go:117] "RemoveContainer" containerID="2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.322556 4761 scope.go:117] "RemoveContainer" containerID="80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d" Dec 01 10:53:37 crc kubenswrapper[4761]: E1201 10:53:37.322908 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d\": container with ID starting with 
80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d not found: ID does not exist" containerID="80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.322933 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d"} err="failed to get container status \"80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d\": rpc error: code = NotFound desc = could not find container \"80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d\": container with ID starting with 80f8864e61029eec4009f8f2f9159cc201b86bee620472d0d0cc7d2d0ccca37d not found: ID does not exist" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.322953 4761 scope.go:117] "RemoveContainer" containerID="2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab" Dec 01 10:53:37 crc kubenswrapper[4761]: E1201 10:53:37.323243 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab\": container with ID starting with 2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab not found: ID does not exist" containerID="2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.323260 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab"} err="failed to get container status \"2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab\": rpc error: code = NotFound desc = could not find container \"2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab\": container with ID starting with 2ffae866496000ea2964a59a88546cb4bb7354bdf36e13e408f40bfab93195ab not found: ID does not 
exist" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.323271 4761 scope.go:117] "RemoveContainer" containerID="b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.339413 4761 scope.go:117] "RemoveContainer" containerID="3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.418911 4761 scope.go:117] "RemoveContainer" containerID="b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5" Dec 01 10:53:37 crc kubenswrapper[4761]: E1201 10:53:37.419582 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5\": container with ID starting with b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5 not found: ID does not exist" containerID="b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.419669 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5"} err="failed to get container status \"b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5\": rpc error: code = NotFound desc = could not find container \"b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5\": container with ID starting with b18a99a6fcf187208be722f21fe408e60ba85be36582aee4f3278bb6c3102dc5 not found: ID does not exist" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.419715 4761 scope.go:117] "RemoveContainer" containerID="3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035" Dec 01 10:53:37 crc kubenswrapper[4761]: E1201 10:53:37.420090 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035\": container with ID starting with 3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035 not found: ID does not exist" containerID="3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035" Dec 01 10:53:37 crc kubenswrapper[4761]: I1201 10:53:37.420137 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035"} err="failed to get container status \"3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035\": rpc error: code = NotFound desc = could not find container \"3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035\": container with ID starting with 3d95f9ab1eba6fdeaef3d833eb3ec14021795425bbc5fb8c0bda470dcd1a1035 not found: ID does not exist" Dec 01 10:53:38 crc kubenswrapper[4761]: I1201 10:53:38.516728 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:53:38 crc kubenswrapper[4761]: I1201 10:53:38.517211 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerName="glance-log" containerID="cri-o://53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89" gracePeriod=30 Dec 01 10:53:38 crc kubenswrapper[4761]: I1201 10:53:38.517315 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerName="glance-httpd" containerID="cri-o://f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4" gracePeriod=30 Dec 01 10:53:38 crc kubenswrapper[4761]: I1201 10:53:38.780606 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:38 crc kubenswrapper[4761]: 
I1201 10:53:38.780998 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerName="glance-log" containerID="cri-o://2a5e06663ee29a958a43e26fffc05dfd35f848e3452429ad8ceff19b664e5861" gracePeriod=30 Dec 01 10:53:38 crc kubenswrapper[4761]: I1201 10:53:38.781086 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerName="glance-httpd" containerID="cri-o://aabd6f20ef43721cbbebdb4fcc852bc4ca6397526caf4e867912a0f157ad95cf" gracePeriod=30 Dec 01 10:53:39 crc kubenswrapper[4761]: I1201 10:53:39.138966 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" path="/var/lib/kubelet/pods/2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0/volumes" Dec 01 10:53:39 crc kubenswrapper[4761]: I1201 10:53:39.139631 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49872107-a7f9-41a8-8277-a50c1a74d521" path="/var/lib/kubelet/pods/49872107-a7f9-41a8-8277-a50c1a74d521/volumes" Dec 01 10:53:39 crc kubenswrapper[4761]: I1201 10:53:39.140207 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" path="/var/lib/kubelet/pods/9a651dda-a109-4fb3-850a-5f8a7f210d8d/volumes" Dec 01 10:53:39 crc kubenswrapper[4761]: I1201 10:53:39.140796 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" path="/var/lib/kubelet/pods/cea9adc0-9cd1-4b76-b738-a43491864db2/volumes" Dec 01 10:53:39 crc kubenswrapper[4761]: I1201 10:53:39.227506 4761 generic.go:334] "Generic (PLEG): container finished" podID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerID="53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89" exitCode=143 Dec 01 10:53:39 crc kubenswrapper[4761]: 
I1201 10:53:39.227606 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ac516c9e-a80c-48c3-9a29-809f073fa66f","Type":"ContainerDied","Data":"53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89"} Dec 01 10:53:39 crc kubenswrapper[4761]: I1201 10:53:39.230226 4761 generic.go:334] "Generic (PLEG): container finished" podID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerID="2a5e06663ee29a958a43e26fffc05dfd35f848e3452429ad8ceff19b664e5861" exitCode=143 Dec 01 10:53:39 crc kubenswrapper[4761]: I1201 10:53:39.230255 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3dbfc272-45cc-42df-ba35-1f70031b0a86","Type":"ContainerDied","Data":"2a5e06663ee29a958a43e26fffc05dfd35f848e3452429ad8ceff19b664e5861"} Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.101859 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228670 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-nvme\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228718 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-iscsi\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228739 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-dev\") pod 
\"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228770 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-scripts\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228820 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-run\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228818 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228845 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-lib-modules\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228868 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-sys\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228870 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-dev" (OuterVolumeSpecName: "dev") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228907 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjll2\" (UniqueName: \"kubernetes.io/projected/ac516c9e-a80c-48c3-9a29-809f073fa66f-kube-api-access-mjll2\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228916 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-run" (OuterVolumeSpecName: "run") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228942 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228943 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228953 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-sys" (OuterVolumeSpecName: "sys") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228967 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-config-data\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.228989 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-var-locks-brick\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229013 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-logs\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229076 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-httpd-run\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229109 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ac516c9e-a80c-48c3-9a29-809f073fa66f\" (UID: \"ac516c9e-a80c-48c3-9a29-809f073fa66f\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229433 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-nvme\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229446 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229457 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229468 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229478 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229610 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.229617 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.230036 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-logs" (OuterVolumeSpecName: "logs") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.230374 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.234723 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.234966 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.235685 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-scripts" (OuterVolumeSpecName: "scripts") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.238032 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac516c9e-a80c-48c3-9a29-809f073fa66f-kube-api-access-mjll2" (OuterVolumeSpecName: "kube-api-access-mjll2") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "kube-api-access-mjll2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.264299 4761 generic.go:334] "Generic (PLEG): container finished" podID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerID="f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4" exitCode=0 Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.264383 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ac516c9e-a80c-48c3-9a29-809f073fa66f","Type":"ContainerDied","Data":"f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4"} Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.264425 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"ac516c9e-a80c-48c3-9a29-809f073fa66f","Type":"ContainerDied","Data":"50dbf3f56a4f7ae5f4579327bf42cb284b618549d1ffb009e81b5c598d41dcb8"} Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.264446 4761 scope.go:117] "RemoveContainer" 
containerID="f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.264863 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.267657 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"3dbfc272-45cc-42df-ba35-1f70031b0a86","Type":"ContainerDied","Data":"aabd6f20ef43721cbbebdb4fcc852bc4ca6397526caf4e867912a0f157ad95cf"} Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.267517 4761 generic.go:334] "Generic (PLEG): container finished" podID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerID="aabd6f20ef43721cbbebdb4fcc852bc4ca6397526caf4e867912a0f157ad95cf" exitCode=0 Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.270971 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-config-data" (OuterVolumeSpecName: "config-data") pod "ac516c9e-a80c-48c3-9a29-809f073fa66f" (UID: "ac516c9e-a80c-48c3-9a29-809f073fa66f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.281704 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.287247 4761 scope.go:117] "RemoveContainer" containerID="53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.311603 4761 scope.go:117] "RemoveContainer" containerID="f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4" Dec 01 10:53:42 crc kubenswrapper[4761]: E1201 10:53:42.321721 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4\": container with ID starting with f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4 not found: ID does not exist" containerID="f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.321780 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4"} err="failed to get container status \"f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4\": rpc error: code = NotFound desc = could not find container \"f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4\": container with ID starting with f9d8c3621057e3580a72047d92fdd79bccebde507b4695a4251544bb7b6d75b4 not found: ID does not exist" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.321812 4761 scope.go:117] "RemoveContainer" containerID="53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89" Dec 01 10:53:42 crc kubenswrapper[4761]: E1201 10:53:42.323078 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89\": container with ID starting with 
53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89 not found: ID does not exist" containerID="53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.323132 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89"} err="failed to get container status \"53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89\": rpc error: code = NotFound desc = could not find container \"53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89\": container with ID starting with 53c28b2686b53ecf74300d38a531702038cc5e1451d5d2d303352404615cfa89 not found: ID does not exist" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331056 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjll2\" (UniqueName: \"kubernetes.io/projected/ac516c9e-a80c-48c3-9a29-809f073fa66f-kube-api-access-mjll2\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331109 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331125 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331139 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331151 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331161 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac516c9e-a80c-48c3-9a29-809f073fa66f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331177 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331188 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac516c9e-a80c-48c3-9a29-809f073fa66f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.331198 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac516c9e-a80c-48c3-9a29-809f073fa66f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.346000 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.347595 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431674 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-sys\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431728 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-nvme\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431749 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-dev\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431805 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-run\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431834 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-scripts\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431881 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2xzh\" (UniqueName: \"kubernetes.io/projected/3dbfc272-45cc-42df-ba35-1f70031b0a86-kube-api-access-n2xzh\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431884 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-sys" (OuterVolumeSpecName: "sys") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431928 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431936 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-run" (OuterVolumeSpecName: "run") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431973 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431994 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.431964 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-lib-modules\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.432142 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-config-data\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.432216 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-iscsi\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.432274 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-logs\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.432314 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-httpd-run\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.432344 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-var-locks-brick\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.432373 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"3dbfc272-45cc-42df-ba35-1f70031b0a86\" (UID: \"3dbfc272-45cc-42df-ba35-1f70031b0a86\") " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.432647 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-dev" (OuterVolumeSpecName: "dev") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.432841 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-logs" (OuterVolumeSpecName: "logs") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433109 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433234 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433172 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433255 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433207 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433266 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433268 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433331 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433354 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433373 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.433390 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.434846 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dbfc272-45cc-42df-ba35-1f70031b0a86-kube-api-access-n2xzh" (OuterVolumeSpecName: "kube-api-access-n2xzh") pod 
"3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "kube-api-access-n2xzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.436197 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.436533 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-scripts" (OuterVolumeSpecName: "scripts") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.439072 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.493301 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-config-data" (OuterVolumeSpecName: "config-data") pod "3dbfc272-45cc-42df-ba35-1f70031b0a86" (UID: "3dbfc272-45cc-42df-ba35-1f70031b0a86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.535352 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.535408 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.535433 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.535452 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dbfc272-45cc-42df-ba35-1f70031b0a86-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.535470 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3dbfc272-45cc-42df-ba35-1f70031b0a86-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.535502 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.535906 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dbfc272-45cc-42df-ba35-1f70031b0a86-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.535963 4761 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-n2xzh\" (UniqueName: \"kubernetes.io/projected/3dbfc272-45cc-42df-ba35-1f70031b0a86-kube-api-access-n2xzh\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.562616 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.564353 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.619770 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.632284 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.637526 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:42 crc kubenswrapper[4761]: I1201 10:53:42.637592 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.144735 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" path="/var/lib/kubelet/pods/ac516c9e-a80c-48c3-9a29-809f073fa66f/volumes" Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.281511 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"3dbfc272-45cc-42df-ba35-1f70031b0a86","Type":"ContainerDied","Data":"e289ab6e67e83de2665fb506ce28d36ed48e3c3563856fca2bd54c34cdb40a9d"} Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.281611 4761 scope.go:117] "RemoveContainer" containerID="aabd6f20ef43721cbbebdb4fcc852bc4ca6397526caf4e867912a0f157ad95cf" Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.281657 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.313840 4761 scope.go:117] "RemoveContainer" containerID="2a5e06663ee29a958a43e26fffc05dfd35f848e3452429ad8ceff19b664e5861" Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.315999 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.329211 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.979425 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-hw5vn"] Dec 01 10:53:43 crc kubenswrapper[4761]: I1201 10:53:43.989328 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-hw5vn"] Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.014749 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glanceaf1e-account-delete-s4xrz"] Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015114 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015137 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 
10:53:44.015163 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015174 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015190 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015200 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015221 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015230 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015247 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015258 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015275 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015322 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015340 4761 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015350 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015365 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015374 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015394 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015403 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015417 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015426 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015438 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49872107-a7f9-41a8-8277-a50c1a74d521" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015448 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="49872107-a7f9-41a8-8277-a50c1a74d521" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: E1201 10:53:44.015465 4761 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="49872107-a7f9-41a8-8277-a50c1a74d521" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015477 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="49872107-a7f9-41a8-8277-a50c1a74d521" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015682 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015701 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb6bd76-9ad7-4f8c-b696-216d8c3b29f0" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015716 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015732 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea9adc0-9cd1-4b76-b738-a43491864db2" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015744 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015754 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015775 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015786 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="49872107-a7f9-41a8-8277-a50c1a74d521" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015802 4761 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ac516c9e-a80c-48c3-9a29-809f073fa66f" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015814 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="49872107-a7f9-41a8-8277-a50c1a74d521" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015830 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerName="glance-log" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.015844 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a651dda-a109-4fb3-850a-5f8a7f210d8d" containerName="glance-httpd" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.016593 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.029832 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glanceaf1e-account-delete-s4xrz"] Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.070459 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41755c48-c4ef-4b03-bccd-c32c05531e7a-operator-scripts\") pod \"glanceaf1e-account-delete-s4xrz\" (UID: \"41755c48-c4ef-4b03-bccd-c32c05531e7a\") " pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.070508 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4h2m\" (UniqueName: \"kubernetes.io/projected/41755c48-c4ef-4b03-bccd-c32c05531e7a-kube-api-access-t4h2m\") pod \"glanceaf1e-account-delete-s4xrz\" (UID: \"41755c48-c4ef-4b03-bccd-c32c05531e7a\") " pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.171774 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41755c48-c4ef-4b03-bccd-c32c05531e7a-operator-scripts\") pod \"glanceaf1e-account-delete-s4xrz\" (UID: \"41755c48-c4ef-4b03-bccd-c32c05531e7a\") " pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.172019 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4h2m\" (UniqueName: \"kubernetes.io/projected/41755c48-c4ef-4b03-bccd-c32c05531e7a-kube-api-access-t4h2m\") pod \"glanceaf1e-account-delete-s4xrz\" (UID: \"41755c48-c4ef-4b03-bccd-c32c05531e7a\") " pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.172851 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41755c48-c4ef-4b03-bccd-c32c05531e7a-operator-scripts\") pod \"glanceaf1e-account-delete-s4xrz\" (UID: \"41755c48-c4ef-4b03-bccd-c32c05531e7a\") " pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.191759 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4h2m\" (UniqueName: \"kubernetes.io/projected/41755c48-c4ef-4b03-bccd-c32c05531e7a-kube-api-access-t4h2m\") pod \"glanceaf1e-account-delete-s4xrz\" (UID: \"41755c48-c4ef-4b03-bccd-c32c05531e7a\") " pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.377935 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:44 crc kubenswrapper[4761]: I1201 10:53:44.824953 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glanceaf1e-account-delete-s4xrz"] Dec 01 10:53:44 crc kubenswrapper[4761]: W1201 10:53:44.829601 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41755c48_c4ef_4b03_bccd_c32c05531e7a.slice/crio-55e8aa8bd3a1c9445faf4861ef96877625184f7f4d09c53bb8f55df607cac63f WatchSource:0}: Error finding container 55e8aa8bd3a1c9445faf4861ef96877625184f7f4d09c53bb8f55df607cac63f: Status 404 returned error can't find the container with id 55e8aa8bd3a1c9445faf4861ef96877625184f7f4d09c53bb8f55df607cac63f Dec 01 10:53:45 crc kubenswrapper[4761]: I1201 10:53:45.142121 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dbfc272-45cc-42df-ba35-1f70031b0a86" path="/var/lib/kubelet/pods/3dbfc272-45cc-42df-ba35-1f70031b0a86/volumes" Dec 01 10:53:45 crc kubenswrapper[4761]: I1201 10:53:45.142871 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a426ca-a5fe-464a-951f-53ae1db79b1e" path="/var/lib/kubelet/pods/d8a426ca-a5fe-464a-951f-53ae1db79b1e/volumes" Dec 01 10:53:45 crc kubenswrapper[4761]: I1201 10:53:45.299756 4761 generic.go:334] "Generic (PLEG): container finished" podID="41755c48-c4ef-4b03-bccd-c32c05531e7a" containerID="9270e8169fdb53f4a7cfa62656a35f20480492955354f4cff741caa91330201b" exitCode=0 Dec 01 10:53:45 crc kubenswrapper[4761]: I1201 10:53:45.299820 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" event={"ID":"41755c48-c4ef-4b03-bccd-c32c05531e7a","Type":"ContainerDied","Data":"9270e8169fdb53f4a7cfa62656a35f20480492955354f4cff741caa91330201b"} Dec 01 10:53:45 crc kubenswrapper[4761]: I1201 10:53:45.301098 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" event={"ID":"41755c48-c4ef-4b03-bccd-c32c05531e7a","Type":"ContainerStarted","Data":"55e8aa8bd3a1c9445faf4861ef96877625184f7f4d09c53bb8f55df607cac63f"} Dec 01 10:53:46 crc kubenswrapper[4761]: I1201 10:53:46.674694 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:46 crc kubenswrapper[4761]: I1201 10:53:46.714656 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41755c48-c4ef-4b03-bccd-c32c05531e7a-operator-scripts\") pod \"41755c48-c4ef-4b03-bccd-c32c05531e7a\" (UID: \"41755c48-c4ef-4b03-bccd-c32c05531e7a\") " Dec 01 10:53:46 crc kubenswrapper[4761]: I1201 10:53:46.714876 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4h2m\" (UniqueName: \"kubernetes.io/projected/41755c48-c4ef-4b03-bccd-c32c05531e7a-kube-api-access-t4h2m\") pod \"41755c48-c4ef-4b03-bccd-c32c05531e7a\" (UID: \"41755c48-c4ef-4b03-bccd-c32c05531e7a\") " Dec 01 10:53:46 crc kubenswrapper[4761]: I1201 10:53:46.715600 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41755c48-c4ef-4b03-bccd-c32c05531e7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41755c48-c4ef-4b03-bccd-c32c05531e7a" (UID: "41755c48-c4ef-4b03-bccd-c32c05531e7a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:53:46 crc kubenswrapper[4761]: I1201 10:53:46.720820 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41755c48-c4ef-4b03-bccd-c32c05531e7a-kube-api-access-t4h2m" (OuterVolumeSpecName: "kube-api-access-t4h2m") pod "41755c48-c4ef-4b03-bccd-c32c05531e7a" (UID: "41755c48-c4ef-4b03-bccd-c32c05531e7a"). InnerVolumeSpecName "kube-api-access-t4h2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:46 crc kubenswrapper[4761]: I1201 10:53:46.815611 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4h2m\" (UniqueName: \"kubernetes.io/projected/41755c48-c4ef-4b03-bccd-c32c05531e7a-kube-api-access-t4h2m\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:46 crc kubenswrapper[4761]: I1201 10:53:46.815639 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41755c48-c4ef-4b03-bccd-c32c05531e7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:47 crc kubenswrapper[4761]: I1201 10:53:47.320435 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" event={"ID":"41755c48-c4ef-4b03-bccd-c32c05531e7a","Type":"ContainerDied","Data":"55e8aa8bd3a1c9445faf4861ef96877625184f7f4d09c53bb8f55df607cac63f"} Dec 01 10:53:47 crc kubenswrapper[4761]: I1201 10:53:47.320498 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55e8aa8bd3a1c9445faf4861ef96877625184f7f4d09c53bb8f55df607cac63f" Dec 01 10:53:47 crc kubenswrapper[4761]: I1201 10:53:47.320601 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glanceaf1e-account-delete-s4xrz" Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.047929 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-scc2g"] Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.058043 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-scc2g"] Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.065438 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-af1e-account-create-update-cs6qz"] Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.073244 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glanceaf1e-account-delete-s4xrz"] Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.083951 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glanceaf1e-account-delete-s4xrz"] Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.094117 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-af1e-account-create-update-cs6qz"] Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.139488 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41755c48-c4ef-4b03-bccd-c32c05531e7a" path="/var/lib/kubelet/pods/41755c48-c4ef-4b03-bccd-c32c05531e7a/volumes" Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.140203 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c91560-7b25-4429-94e9-aa13a92bf417" path="/var/lib/kubelet/pods/91c91560-7b25-4429-94e9-aa13a92bf417/volumes" Dec 01 10:53:49 crc kubenswrapper[4761]: I1201 10:53:49.140830 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9664d36-3b98-46d7-8af1-b1636a2f370a" path="/var/lib/kubelet/pods/c9664d36-3b98-46d7-8af1-b1636a2f370a/volumes" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.198687 4761 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/glance-db-create-h2nvr"] Dec 01 10:53:50 crc kubenswrapper[4761]: E1201 10:53:50.199171 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41755c48-c4ef-4b03-bccd-c32c05531e7a" containerName="mariadb-account-delete" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.199195 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="41755c48-c4ef-4b03-bccd-c32c05531e7a" containerName="mariadb-account-delete" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.199407 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="41755c48-c4ef-4b03-bccd-c32c05531e7a" containerName="mariadb-account-delete" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.200191 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.221818 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-5333-account-create-update-xkdj2"] Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.224665 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.226994 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.260961 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5333-account-create-update-xkdj2"] Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.270896 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f89j4\" (UniqueName: \"kubernetes.io/projected/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-kube-api-access-f89j4\") pod \"glance-db-create-h2nvr\" (UID: \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\") " pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.271010 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tj8p\" (UniqueName: \"kubernetes.io/projected/6757cc2e-75a3-4e72-86a9-116cf82ee293-kube-api-access-4tj8p\") pod \"glance-5333-account-create-update-xkdj2\" (UID: \"6757cc2e-75a3-4e72-86a9-116cf82ee293\") " pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.271049 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6757cc2e-75a3-4e72-86a9-116cf82ee293-operator-scripts\") pod \"glance-5333-account-create-update-xkdj2\" (UID: \"6757cc2e-75a3-4e72-86a9-116cf82ee293\") " pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.271118 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-operator-scripts\") pod \"glance-db-create-h2nvr\" (UID: \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\") " pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.285713 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-h2nvr"] Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.372255 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f89j4\" (UniqueName: \"kubernetes.io/projected/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-kube-api-access-f89j4\") pod \"glance-db-create-h2nvr\" (UID: \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\") " pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.372348 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tj8p\" (UniqueName: \"kubernetes.io/projected/6757cc2e-75a3-4e72-86a9-116cf82ee293-kube-api-access-4tj8p\") pod \"glance-5333-account-create-update-xkdj2\" (UID: \"6757cc2e-75a3-4e72-86a9-116cf82ee293\") " pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.372378 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6757cc2e-75a3-4e72-86a9-116cf82ee293-operator-scripts\") pod \"glance-5333-account-create-update-xkdj2\" (UID: \"6757cc2e-75a3-4e72-86a9-116cf82ee293\") " pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.372421 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-operator-scripts\") pod \"glance-db-create-h2nvr\" (UID: \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\") " 
pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.373150 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-operator-scripts\") pod \"glance-db-create-h2nvr\" (UID: \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\") " pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.374219 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6757cc2e-75a3-4e72-86a9-116cf82ee293-operator-scripts\") pod \"glance-5333-account-create-update-xkdj2\" (UID: \"6757cc2e-75a3-4e72-86a9-116cf82ee293\") " pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.397576 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f89j4\" (UniqueName: \"kubernetes.io/projected/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-kube-api-access-f89j4\") pod \"glance-db-create-h2nvr\" (UID: \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\") " pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.397604 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tj8p\" (UniqueName: \"kubernetes.io/projected/6757cc2e-75a3-4e72-86a9-116cf82ee293-kube-api-access-4tj8p\") pod \"glance-5333-account-create-update-xkdj2\" (UID: \"6757cc2e-75a3-4e72-86a9-116cf82ee293\") " pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.539160 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.571297 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:50 crc kubenswrapper[4761]: I1201 10:53:50.821975 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5333-account-create-update-xkdj2"] Dec 01 10:53:51 crc kubenswrapper[4761]: I1201 10:53:51.002026 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-h2nvr"] Dec 01 10:53:51 crc kubenswrapper[4761]: I1201 10:53:51.368159 4761 generic.go:334] "Generic (PLEG): container finished" podID="6757cc2e-75a3-4e72-86a9-116cf82ee293" containerID="ff82762c704ee0d6672b7257584125f1f97359cb54bd7db233dd8b4d58778082" exitCode=0 Dec 01 10:53:51 crc kubenswrapper[4761]: I1201 10:53:51.368264 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" event={"ID":"6757cc2e-75a3-4e72-86a9-116cf82ee293","Type":"ContainerDied","Data":"ff82762c704ee0d6672b7257584125f1f97359cb54bd7db233dd8b4d58778082"} Dec 01 10:53:51 crc kubenswrapper[4761]: I1201 10:53:51.368423 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" event={"ID":"6757cc2e-75a3-4e72-86a9-116cf82ee293","Type":"ContainerStarted","Data":"4a721b5e9c1417f2afb59106057dbcf4370df7cc68a6c2be49dfc0a38ba2b277"} Dec 01 10:53:51 crc kubenswrapper[4761]: I1201 10:53:51.370567 4761 generic.go:334] "Generic (PLEG): container finished" podID="0db33355-a2b7-4291-85f9-2b5ad6f11d8b" containerID="6dd719fd2c08d901e32219e95784babf0c628ee0d965c75e8c54e9d06b18e929" exitCode=0 Dec 01 10:53:51 crc kubenswrapper[4761]: I1201 10:53:51.370642 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-h2nvr" event={"ID":"0db33355-a2b7-4291-85f9-2b5ad6f11d8b","Type":"ContainerDied","Data":"6dd719fd2c08d901e32219e95784babf0c628ee0d965c75e8c54e9d06b18e929"} Dec 01 10:53:51 crc kubenswrapper[4761]: I1201 10:53:51.370671 
4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-h2nvr" event={"ID":"0db33355-a2b7-4291-85f9-2b5ad6f11d8b","Type":"ContainerStarted","Data":"f27747cfd7307206ad3f72e5336b8fadf581c975a6cce55fa1e27c861946086b"} Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.742886 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.747627 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.822231 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tj8p\" (UniqueName: \"kubernetes.io/projected/6757cc2e-75a3-4e72-86a9-116cf82ee293-kube-api-access-4tj8p\") pod \"6757cc2e-75a3-4e72-86a9-116cf82ee293\" (UID: \"6757cc2e-75a3-4e72-86a9-116cf82ee293\") " Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.822280 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-operator-scripts\") pod \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\" (UID: \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\") " Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.822410 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f89j4\" (UniqueName: \"kubernetes.io/projected/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-kube-api-access-f89j4\") pod \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\" (UID: \"0db33355-a2b7-4291-85f9-2b5ad6f11d8b\") " Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.823059 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6757cc2e-75a3-4e72-86a9-116cf82ee293-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "6757cc2e-75a3-4e72-86a9-116cf82ee293" (UID: "6757cc2e-75a3-4e72-86a9-116cf82ee293"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.823066 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0db33355-a2b7-4291-85f9-2b5ad6f11d8b" (UID: "0db33355-a2b7-4291-85f9-2b5ad6f11d8b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.822511 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6757cc2e-75a3-4e72-86a9-116cf82ee293-operator-scripts\") pod \"6757cc2e-75a3-4e72-86a9-116cf82ee293\" (UID: \"6757cc2e-75a3-4e72-86a9-116cf82ee293\") " Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.823533 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.823571 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6757cc2e-75a3-4e72-86a9-116cf82ee293-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.827673 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-kube-api-access-f89j4" (OuterVolumeSpecName: "kube-api-access-f89j4") pod "0db33355-a2b7-4291-85f9-2b5ad6f11d8b" (UID: "0db33355-a2b7-4291-85f9-2b5ad6f11d8b"). InnerVolumeSpecName "kube-api-access-f89j4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.828044 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6757cc2e-75a3-4e72-86a9-116cf82ee293-kube-api-access-4tj8p" (OuterVolumeSpecName: "kube-api-access-4tj8p") pod "6757cc2e-75a3-4e72-86a9-116cf82ee293" (UID: "6757cc2e-75a3-4e72-86a9-116cf82ee293"). InnerVolumeSpecName "kube-api-access-4tj8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.892390 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mtfwb"] Dec 01 10:53:52 crc kubenswrapper[4761]: E1201 10:53:52.892837 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db33355-a2b7-4291-85f9-2b5ad6f11d8b" containerName="mariadb-database-create" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.892867 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db33355-a2b7-4291-85f9-2b5ad6f11d8b" containerName="mariadb-database-create" Dec 01 10:53:52 crc kubenswrapper[4761]: E1201 10:53:52.892982 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6757cc2e-75a3-4e72-86a9-116cf82ee293" containerName="mariadb-account-create-update" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.893002 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6757cc2e-75a3-4e72-86a9-116cf82ee293" containerName="mariadb-account-create-update" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.893189 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db33355-a2b7-4291-85f9-2b5ad6f11d8b" containerName="mariadb-database-create" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.893214 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="6757cc2e-75a3-4e72-86a9-116cf82ee293" containerName="mariadb-account-create-update" Dec 01 10:53:52 crc kubenswrapper[4761]: 
I1201 10:53:52.894653 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.907473 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtfwb"] Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.925536 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-utilities\") pod \"redhat-marketplace-mtfwb\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.925686 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-catalog-content\") pod \"redhat-marketplace-mtfwb\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.925807 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jd5n\" (UniqueName: \"kubernetes.io/projected/f3325c24-a894-4ab2-a5fc-b3d40597795e-kube-api-access-6jd5n\") pod \"redhat-marketplace-mtfwb\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.925942 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tj8p\" (UniqueName: \"kubernetes.io/projected/6757cc2e-75a3-4e72-86a9-116cf82ee293-kube-api-access-4tj8p\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:52 crc kubenswrapper[4761]: I1201 10:53:52.925965 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f89j4\" 
(UniqueName: \"kubernetes.io/projected/0db33355-a2b7-4291-85f9-2b5ad6f11d8b-kube-api-access-f89j4\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.027849 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-utilities\") pod \"redhat-marketplace-mtfwb\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.027955 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-catalog-content\") pod \"redhat-marketplace-mtfwb\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.028032 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jd5n\" (UniqueName: \"kubernetes.io/projected/f3325c24-a894-4ab2-a5fc-b3d40597795e-kube-api-access-6jd5n\") pod \"redhat-marketplace-mtfwb\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.029167 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-utilities\") pod \"redhat-marketplace-mtfwb\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.029356 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-catalog-content\") pod \"redhat-marketplace-mtfwb\" (UID: 
\"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.046176 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jd5n\" (UniqueName: \"kubernetes.io/projected/f3325c24-a894-4ab2-a5fc-b3d40597795e-kube-api-access-6jd5n\") pod \"redhat-marketplace-mtfwb\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.236295 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.389231 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-h2nvr" event={"ID":"0db33355-a2b7-4291-85f9-2b5ad6f11d8b","Type":"ContainerDied","Data":"f27747cfd7307206ad3f72e5336b8fadf581c975a6cce55fa1e27c861946086b"} Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.389449 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27747cfd7307206ad3f72e5336b8fadf581c975a6cce55fa1e27c861946086b" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.389471 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-h2nvr" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.391248 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" event={"ID":"6757cc2e-75a3-4e72-86a9-116cf82ee293","Type":"ContainerDied","Data":"4a721b5e9c1417f2afb59106057dbcf4370df7cc68a6c2be49dfc0a38ba2b277"} Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.391266 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a721b5e9c1417f2afb59106057dbcf4370df7cc68a6c2be49dfc0a38ba2b277" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.391297 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5333-account-create-update-xkdj2" Dec 01 10:53:53 crc kubenswrapper[4761]: I1201 10:53:53.722268 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtfwb"] Dec 01 10:53:54 crc kubenswrapper[4761]: I1201 10:53:54.405962 4761 generic.go:334] "Generic (PLEG): container finished" podID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerID="298463f7284ecbbd87d05d248b85b7c460ec92ad5a35d1685a45d10f3514fbc3" exitCode=0 Dec 01 10:53:54 crc kubenswrapper[4761]: I1201 10:53:54.406091 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtfwb" event={"ID":"f3325c24-a894-4ab2-a5fc-b3d40597795e","Type":"ContainerDied","Data":"298463f7284ecbbd87d05d248b85b7c460ec92ad5a35d1685a45d10f3514fbc3"} Dec 01 10:53:54 crc kubenswrapper[4761]: I1201 10:53:54.406456 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtfwb" event={"ID":"f3325c24-a894-4ab2-a5fc-b3d40597795e","Type":"ContainerStarted","Data":"efc6bee9b3506058d3cc02f8a04a0c02d36c8f10785291d3aff76a561dc77213"} Dec 01 10:53:54 crc kubenswrapper[4761]: I1201 10:53:54.408903 4761 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.368097 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-hjf46"] Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.369234 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-hjf46" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.373003 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-nn2mm" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.373247 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.377740 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-hjf46"] Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.463745 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-config-data\") pod \"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.463815 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbwcw\" (UniqueName: \"kubernetes.io/projected/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-kube-api-access-lbwcw\") pod \"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.463867 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-db-sync-config-data\") pod \"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.565147 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbwcw\" (UniqueName: \"kubernetes.io/projected/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-kube-api-access-lbwcw\") pod \"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.565227 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-db-sync-config-data\") pod \"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.565291 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-config-data\") pod \"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.570296 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-db-sync-config-data\") pod \"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46" Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.573382 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-config-data\") pod 
\"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46"
Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.595355 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbwcw\" (UniqueName: \"kubernetes.io/projected/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-kube-api-access-lbwcw\") pod \"glance-db-sync-hjf46\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") " pod="glance-kuttl-tests/glance-db-sync-hjf46"
Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.686162 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-hjf46"
Dec 01 10:53:55 crc kubenswrapper[4761]: I1201 10:53:55.921341 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-hjf46"]
Dec 01 10:53:55 crc kubenswrapper[4761]: W1201 10:53:55.922672 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e8b1045_72b1_4dd5_a7b0_f58713505cb7.slice/crio-16ba643ab41931d6ec9d03ff2ab731ef625180ccdb1f31e1484fdfdd4c829dc6 WatchSource:0}: Error finding container 16ba643ab41931d6ec9d03ff2ab731ef625180ccdb1f31e1484fdfdd4c829dc6: Status 404 returned error can't find the container with id 16ba643ab41931d6ec9d03ff2ab731ef625180ccdb1f31e1484fdfdd4c829dc6
Dec 01 10:53:56 crc kubenswrapper[4761]: I1201 10:53:56.424299 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-hjf46" event={"ID":"0e8b1045-72b1-4dd5-a7b0-f58713505cb7","Type":"ContainerStarted","Data":"16ba643ab41931d6ec9d03ff2ab731ef625180ccdb1f31e1484fdfdd4c829dc6"}
Dec 01 10:53:56 crc kubenswrapper[4761]: I1201 10:53:56.426729 4761 generic.go:334] "Generic (PLEG): container finished" podID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerID="94b6669b96fcb367e094aa97d1fda15cdcebfeb3ab3c0e5a5cea4841f1caab42" exitCode=0
Dec 01 10:53:56 crc kubenswrapper[4761]: I1201 10:53:56.426790 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtfwb" event={"ID":"f3325c24-a894-4ab2-a5fc-b3d40597795e","Type":"ContainerDied","Data":"94b6669b96fcb367e094aa97d1fda15cdcebfeb3ab3c0e5a5cea4841f1caab42"}
Dec 01 10:53:57 crc kubenswrapper[4761]: I1201 10:53:57.435213 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-hjf46" event={"ID":"0e8b1045-72b1-4dd5-a7b0-f58713505cb7","Type":"ContainerStarted","Data":"fec549f18510f9be0081d189d5b22bd5048deb784449d03335aa7f9f5e29a4e1"}
Dec 01 10:53:57 crc kubenswrapper[4761]: I1201 10:53:57.442338 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtfwb" event={"ID":"f3325c24-a894-4ab2-a5fc-b3d40597795e","Type":"ContainerStarted","Data":"b29b1ba9cdfdbed40a5f723f8439f38f30f309041fbcbc97d4d59c86e442f767"}
Dec 01 10:53:57 crc kubenswrapper[4761]: I1201 10:53:57.459654 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-hjf46" podStartSLOduration=2.459636765 podStartE2EDuration="2.459636765s" podCreationTimestamp="2025-12-01 10:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:53:57.453535503 +0000 UTC m=+1376.757294127" watchObservedRunningTime="2025-12-01 10:53:57.459636765 +0000 UTC m=+1376.763395389"
Dec 01 10:53:57 crc kubenswrapper[4761]: I1201 10:53:57.487202 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mtfwb" podStartSLOduration=2.721314634 podStartE2EDuration="5.487184066s" podCreationTimestamp="2025-12-01 10:53:52 +0000 UTC" firstStartedPulling="2025-12-01 10:53:54.408602575 +0000 UTC m=+1373.712361209" lastFinishedPulling="2025-12-01 10:53:57.174472017 +0000 UTC m=+1376.478230641" observedRunningTime="2025-12-01 10:53:57.483297513 +0000 UTC m=+1376.787056137" watchObservedRunningTime="2025-12-01 10:53:57.487184066 +0000 UTC m=+1376.790942700"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.088172 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jf2tz"]
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.090375 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.103754 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jf2tz"]
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.118217 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3393fe92-7c80-4229-bc37-a12db29df394-catalog-content\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.118560 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3393fe92-7c80-4229-bc37-a12db29df394-utilities\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.118626 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-229nn\" (UniqueName: \"kubernetes.io/projected/3393fe92-7c80-4229-bc37-a12db29df394-kube-api-access-229nn\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.219764 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3393fe92-7c80-4229-bc37-a12db29df394-catalog-content\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.219901 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3393fe92-7c80-4229-bc37-a12db29df394-utilities\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.219928 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-229nn\" (UniqueName: \"kubernetes.io/projected/3393fe92-7c80-4229-bc37-a12db29df394-kube-api-access-229nn\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.220454 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3393fe92-7c80-4229-bc37-a12db29df394-utilities\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.220592 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3393fe92-7c80-4229-bc37-a12db29df394-catalog-content\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.244649 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-229nn\" (UniqueName: \"kubernetes.io/projected/3393fe92-7c80-4229-bc37-a12db29df394-kube-api-access-229nn\") pod \"redhat-operators-jf2tz\" (UID: \"3393fe92-7c80-4229-bc37-a12db29df394\") " pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.428682 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jf2tz"
Dec 01 10:53:58 crc kubenswrapper[4761]: I1201 10:53:58.899117 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jf2tz"]
Dec 01 10:53:58 crc kubenswrapper[4761]: W1201 10:53:58.907690 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3393fe92_7c80_4229_bc37_a12db29df394.slice/crio-5404e88c97ef28824cc911ae11c79eb8aa00c318d8a2989870558e35d4c29482 WatchSource:0}: Error finding container 5404e88c97ef28824cc911ae11c79eb8aa00c318d8a2989870558e35d4c29482: Status 404 returned error can't find the container with id 5404e88c97ef28824cc911ae11c79eb8aa00c318d8a2989870558e35d4c29482
Dec 01 10:53:59 crc kubenswrapper[4761]: I1201 10:53:59.456684 4761 generic.go:334] "Generic (PLEG): container finished" podID="3393fe92-7c80-4229-bc37-a12db29df394" containerID="da62643830a119c41da2a64bdf4c19d129b397ea4deb8995e0b2c1f2b4824532" exitCode=0
Dec 01 10:53:59 crc kubenswrapper[4761]: I1201 10:53:59.456746 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf2tz" event={"ID":"3393fe92-7c80-4229-bc37-a12db29df394","Type":"ContainerDied","Data":"da62643830a119c41da2a64bdf4c19d129b397ea4deb8995e0b2c1f2b4824532"}
Dec 01 10:53:59 crc kubenswrapper[4761]: I1201 10:53:59.456969 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf2tz" event={"ID":"3393fe92-7c80-4229-bc37-a12db29df394","Type":"ContainerStarted","Data":"5404e88c97ef28824cc911ae11c79eb8aa00c318d8a2989870558e35d4c29482"}
Dec 01 10:54:00 crc kubenswrapper[4761]: I1201 10:54:00.466899 4761 generic.go:334] "Generic (PLEG): container finished" podID="0e8b1045-72b1-4dd5-a7b0-f58713505cb7" containerID="fec549f18510f9be0081d189d5b22bd5048deb784449d03335aa7f9f5e29a4e1" exitCode=0
Dec 01 10:54:00 crc kubenswrapper[4761]: I1201 10:54:00.466952 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-hjf46" event={"ID":"0e8b1045-72b1-4dd5-a7b0-f58713505cb7","Type":"ContainerDied","Data":"fec549f18510f9be0081d189d5b22bd5048deb784449d03335aa7f9f5e29a4e1"}
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.761654 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-hjf46"
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.881324 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbwcw\" (UniqueName: \"kubernetes.io/projected/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-kube-api-access-lbwcw\") pod \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") "
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.881363 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-config-data\") pod \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") "
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.882196 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-db-sync-config-data\") pod \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\" (UID: \"0e8b1045-72b1-4dd5-a7b0-f58713505cb7\") "
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.886903 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0e8b1045-72b1-4dd5-a7b0-f58713505cb7" (UID: "0e8b1045-72b1-4dd5-a7b0-f58713505cb7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.887419 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-kube-api-access-lbwcw" (OuterVolumeSpecName: "kube-api-access-lbwcw") pod "0e8b1045-72b1-4dd5-a7b0-f58713505cb7" (UID: "0e8b1045-72b1-4dd5-a7b0-f58713505cb7"). InnerVolumeSpecName "kube-api-access-lbwcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.951747 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-config-data" (OuterVolumeSpecName: "config-data") pod "0e8b1045-72b1-4dd5-a7b0-f58713505cb7" (UID: "0e8b1045-72b1-4dd5-a7b0-f58713505cb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.983650 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.983685 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbwcw\" (UniqueName: \"kubernetes.io/projected/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-kube-api-access-lbwcw\") on node \"crc\" DevicePath \"\""
Dec 01 10:54:01 crc kubenswrapper[4761]: I1201 10:54:01.983695 4761 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e8b1045-72b1-4dd5-a7b0-f58713505cb7-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:54:02 crc kubenswrapper[4761]: I1201 10:54:02.482581 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-hjf46" event={"ID":"0e8b1045-72b1-4dd5-a7b0-f58713505cb7","Type":"ContainerDied","Data":"16ba643ab41931d6ec9d03ff2ab731ef625180ccdb1f31e1484fdfdd4c829dc6"}
Dec 01 10:54:02 crc kubenswrapper[4761]: I1201 10:54:02.482635 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16ba643ab41931d6ec9d03ff2ab731ef625180ccdb1f31e1484fdfdd4c829dc6"
Dec 01 10:54:02 crc kubenswrapper[4761]: I1201 10:54:02.482660 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-hjf46"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.237344 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mtfwb"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.237383 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mtfwb"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.324485 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mtfwb"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.539933 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mtfwb"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.599230 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtfwb"]
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.827834 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Dec 01 10:54:03 crc kubenswrapper[4761]: E1201 10:54:03.828416 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8b1045-72b1-4dd5-a7b0-f58713505cb7" containerName="glance-db-sync"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.828438 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8b1045-72b1-4dd5-a7b0-f58713505cb7" containerName="glance-db-sync"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.828575 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8b1045-72b1-4dd5-a7b0-f58713505cb7" containerName="glance-db-sync"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.829231 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.832018 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.832293 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.832431 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-nn2mm"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.836542 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.908836 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-httpd-run\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.908882 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.908901 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.908916 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-config-data\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.908949 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-run\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.908971 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbnzp\" (UniqueName: \"kubernetes.io/projected/1ed22625-8fda-492f-be20-42a940f00a15-kube-api-access-tbnzp\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.908992 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-scripts\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.909014 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-lib-modules\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.909030 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-nvme\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.909059 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.909081 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-sys\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.909110 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-dev\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.909144 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.909158 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-logs\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:03 crc kubenswrapper[4761]: I1201 10:54:03.916569 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Dec 01 10:54:03 crc kubenswrapper[4761]: E1201 10:54:03.917151 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data dev etc-iscsi etc-nvme glance glance-cache httpd-run kube-api-access-tbnzp lib-modules logs run scripts sys var-locks-brick], unattached volumes=[], failed to process volumes=[]: context canceled" pod="glance-kuttl-tests/glance-default-single-0" podUID="1ed22625-8fda-492f-be20-42a940f00a15"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010102 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-sys\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010164 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-dev\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010192 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010208 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-logs\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010225 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-httpd-run\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010249 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010265 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010282 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-config-data\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010321 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-run\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010341 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbnzp\" (UniqueName: \"kubernetes.io/projected/1ed22625-8fda-492f-be20-42a940f00a15-kube-api-access-tbnzp\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010360 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-scripts\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010386 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-lib-modules\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010401 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-nvme\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010427 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010491 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010527 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-sys\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010562 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-dev\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010617 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.010970 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-logs\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.011184 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-httpd-run\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.011508 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.011737 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-run\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.011897 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.012001 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-lib-modules\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.012157 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-nvme\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.016771 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-scripts\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.018139 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-config-data\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.033767 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.046907 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.049618 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbnzp\" (UniqueName: \"kubernetes.io/projected/1ed22625-8fda-492f-be20-42a940f00a15-kube-api-access-tbnzp\") pod \"glance-default-single-0\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") " pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.497009 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.507605 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517368 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-run\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517401 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517424 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-nvme\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517443 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517478 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-lib-modules\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517526 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-var-locks-brick\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517595 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-scripts\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517615 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-dev\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517636 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-logs\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517671 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-iscsi\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517696 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-httpd-run\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517723 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbnzp\" (UniqueName: \"kubernetes.io/projected/1ed22625-8fda-492f-be20-42a940f00a15-kube-api-access-tbnzp\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517749 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-config-data\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.517770 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-sys\") pod \"1ed22625-8fda-492f-be20-42a940f00a15\" (UID: \"1ed22625-8fda-492f-be20-42a940f00a15\") "
Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.518301 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-run" (OuterVolumeSpecName: "run") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "run".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.518608 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-dev" (OuterVolumeSpecName: "dev") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.518638 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.518865 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.518918 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.518985 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-sys" (OuterVolumeSpecName: "sys") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.519009 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.519017 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.519230 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-logs" (OuterVolumeSpecName: "logs") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.521681 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-config-data" (OuterVolumeSpecName: "config-data") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.521864 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-scripts" (OuterVolumeSpecName: "scripts") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.522245 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.523925 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed22625-8fda-492f-be20-42a940f00a15-kube-api-access-tbnzp" (OuterVolumeSpecName: "kube-api-access-tbnzp") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "kube-api-access-tbnzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.531844 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "1ed22625-8fda-492f-be20-42a940f00a15" (UID: "1ed22625-8fda-492f-be20-42a940f00a15"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619625 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619658 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619669 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed22625-8fda-492f-be20-42a940f00a15-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619678 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbnzp\" (UniqueName: \"kubernetes.io/projected/1ed22625-8fda-492f-be20-42a940f00a15-kube-api-access-tbnzp\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619689 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619697 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-sys\") on node 
\"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619705 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619773 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619787 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619801 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619809 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619818 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619827 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed22625-8fda-492f-be20-42a940f00a15-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.619836 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/1ed22625-8fda-492f-be20-42a940f00a15-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.637529 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.637873 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.721660 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:04 crc kubenswrapper[4761]: I1201 10:54:04.721701 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.503826 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.504764 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mtfwb" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerName="registry-server" containerID="cri-o://b29b1ba9cdfdbed40a5f723f8439f38f30f309041fbcbc97d4d59c86e442f767" gracePeriod=2 Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.563398 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.573730 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.580887 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.582097 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.587160 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.587411 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.588331 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.589057 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-nn2mm" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.635165 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-lib-modules\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.635386 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpb5\" (UniqueName: \"kubernetes.io/projected/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-kube-api-access-xhpb5\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.635471 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-config-data\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" 
Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.635598 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-sys\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.635774 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.635862 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-dev\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.635930 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.636000 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-logs\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.636066 
4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.636134 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-run\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.636205 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-scripts\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.636279 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-nvme\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.636344 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.636427 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-httpd-run\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.737800 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-sys\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.737879 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.737907 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-dev\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.737929 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.737943 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-logs\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.737945 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-sys\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.737960 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738033 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-run\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738078 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-scripts\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738119 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-nvme\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " 
pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738141 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738165 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-httpd-run\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738191 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-lib-modules\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738213 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpb5\" (UniqueName: \"kubernetes.io/projected/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-kube-api-access-xhpb5\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738227 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") device mount path \"/mnt/openstack/pv15\"" 
pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738237 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-config-data\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738653 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738680 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-dev\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738739 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738832 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 
10:54:05.738834 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-run\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738954 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-nvme\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.738957 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-lib-modules\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.739199 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-httpd-run\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.739239 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-logs\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.749356 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-scripts\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.755494 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-config-data\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.762591 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhpb5\" (UniqueName: \"kubernetes.io/projected/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-kube-api-access-xhpb5\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.764139 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.779880 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-0\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:05 crc kubenswrapper[4761]: I1201 10:54:05.917082 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:06 crc kubenswrapper[4761]: I1201 10:54:06.515078 4761 generic.go:334] "Generic (PLEG): container finished" podID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerID="b29b1ba9cdfdbed40a5f723f8439f38f30f309041fbcbc97d4d59c86e442f767" exitCode=0 Dec 01 10:54:06 crc kubenswrapper[4761]: I1201 10:54:06.515318 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtfwb" event={"ID":"f3325c24-a894-4ab2-a5fc-b3d40597795e","Type":"ContainerDied","Data":"b29b1ba9cdfdbed40a5f723f8439f38f30f309041fbcbc97d4d59c86e442f767"} Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.136599 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed22625-8fda-492f-be20-42a940f00a15" path="/var/lib/kubelet/pods/1ed22625-8fda-492f-be20-42a940f00a15/volumes" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.431180 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.527735 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf2tz" event={"ID":"3393fe92-7c80-4229-bc37-a12db29df394","Type":"ContainerStarted","Data":"67a8f53435a66689a1bb78b53cdaa847637fc280a2d9350147fa83429fcd410a"} Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.530667 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtfwb" event={"ID":"f3325c24-a894-4ab2-a5fc-b3d40597795e","Type":"ContainerDied","Data":"efc6bee9b3506058d3cc02f8a04a0c02d36c8f10785291d3aff76a561dc77213"} Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.530718 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtfwb" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.530734 4761 scope.go:117] "RemoveContainer" containerID="b29b1ba9cdfdbed40a5f723f8439f38f30f309041fbcbc97d4d59c86e442f767" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.556051 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.558981 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-utilities\") pod \"f3325c24-a894-4ab2-a5fc-b3d40597795e\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.559039 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jd5n\" (UniqueName: \"kubernetes.io/projected/f3325c24-a894-4ab2-a5fc-b3d40597795e-kube-api-access-6jd5n\") pod \"f3325c24-a894-4ab2-a5fc-b3d40597795e\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.559133 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-catalog-content\") pod \"f3325c24-a894-4ab2-a5fc-b3d40597795e\" (UID: \"f3325c24-a894-4ab2-a5fc-b3d40597795e\") " Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.559865 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-utilities" (OuterVolumeSpecName: "utilities") pod "f3325c24-a894-4ab2-a5fc-b3d40597795e" (UID: "f3325c24-a894-4ab2-a5fc-b3d40597795e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:07 crc kubenswrapper[4761]: W1201 10:54:07.565541 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77fde0e2_cc4a_4dd9_9da9_b2f8ffdce5c9.slice/crio-f0ebbe5e65d5b9ff01ae86c34f35ea70a3e4568dbcf6e13b0c9b9fcc5f849938 WatchSource:0}: Error finding container f0ebbe5e65d5b9ff01ae86c34f35ea70a3e4568dbcf6e13b0c9b9fcc5f849938: Status 404 returned error can't find the container with id f0ebbe5e65d5b9ff01ae86c34f35ea70a3e4568dbcf6e13b0c9b9fcc5f849938 Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.568330 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3325c24-a894-4ab2-a5fc-b3d40597795e-kube-api-access-6jd5n" (OuterVolumeSpecName: "kube-api-access-6jd5n") pod "f3325c24-a894-4ab2-a5fc-b3d40597795e" (UID: "f3325c24-a894-4ab2-a5fc-b3d40597795e"). InnerVolumeSpecName "kube-api-access-6jd5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.581964 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3325c24-a894-4ab2-a5fc-b3d40597795e" (UID: "f3325c24-a894-4ab2-a5fc-b3d40597795e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.601095 4761 scope.go:117] "RemoveContainer" containerID="94b6669b96fcb367e094aa97d1fda15cdcebfeb3ab3c0e5a5cea4841f1caab42" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.621158 4761 scope.go:117] "RemoveContainer" containerID="298463f7284ecbbd87d05d248b85b7c460ec92ad5a35d1685a45d10f3514fbc3" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.661297 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.661329 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3325c24-a894-4ab2-a5fc-b3d40597795e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.661342 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jd5n\" (UniqueName: \"kubernetes.io/projected/f3325c24-a894-4ab2-a5fc-b3d40597795e-kube-api-access-6jd5n\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.892727 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtfwb"] Dec 01 10:54:07 crc kubenswrapper[4761]: I1201 10:54:07.898948 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtfwb"] Dec 01 10:54:08 crc kubenswrapper[4761]: I1201 10:54:08.542456 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9","Type":"ContainerStarted","Data":"6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94"} Dec 01 10:54:08 crc kubenswrapper[4761]: I1201 10:54:08.542846 4761 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9","Type":"ContainerStarted","Data":"d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1"} Dec 01 10:54:08 crc kubenswrapper[4761]: I1201 10:54:08.542862 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9","Type":"ContainerStarted","Data":"f0ebbe5e65d5b9ff01ae86c34f35ea70a3e4568dbcf6e13b0c9b9fcc5f849938"} Dec 01 10:54:08 crc kubenswrapper[4761]: I1201 10:54:08.546760 4761 generic.go:334] "Generic (PLEG): container finished" podID="3393fe92-7c80-4229-bc37-a12db29df394" containerID="67a8f53435a66689a1bb78b53cdaa847637fc280a2d9350147fa83429fcd410a" exitCode=0 Dec 01 10:54:08 crc kubenswrapper[4761]: I1201 10:54:08.546833 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf2tz" event={"ID":"3393fe92-7c80-4229-bc37-a12db29df394","Type":"ContainerDied","Data":"67a8f53435a66689a1bb78b53cdaa847637fc280a2d9350147fa83429fcd410a"} Dec 01 10:54:08 crc kubenswrapper[4761]: I1201 10:54:08.621757 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.621732671 podStartE2EDuration="3.621732671s" podCreationTimestamp="2025-12-01 10:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:54:08.585177051 +0000 UTC m=+1387.888935725" watchObservedRunningTime="2025-12-01 10:54:08.621732671 +0000 UTC m=+1387.925491335" Dec 01 10:54:09 crc kubenswrapper[4761]: I1201 10:54:09.149368 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" path="/var/lib/kubelet/pods/f3325c24-a894-4ab2-a5fc-b3d40597795e/volumes" Dec 01 10:54:09 crc kubenswrapper[4761]: I1201 10:54:09.557887 4761 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf2tz" event={"ID":"3393fe92-7c80-4229-bc37-a12db29df394","Type":"ContainerStarted","Data":"068459f5925c300ccd4f7d160fa7f96696d887d81901bd628dea0939b15bfb5e"} Dec 01 10:54:09 crc kubenswrapper[4761]: I1201 10:54:09.573959 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jf2tz" podStartSLOduration=1.66532171 podStartE2EDuration="11.57391834s" podCreationTimestamp="2025-12-01 10:53:58 +0000 UTC" firstStartedPulling="2025-12-01 10:53:59.458145502 +0000 UTC m=+1378.761904126" lastFinishedPulling="2025-12-01 10:54:09.366742102 +0000 UTC m=+1388.670500756" observedRunningTime="2025-12-01 10:54:09.572993885 +0000 UTC m=+1388.876752529" watchObservedRunningTime="2025-12-01 10:54:09.57391834 +0000 UTC m=+1388.877676974" Dec 01 10:54:15 crc kubenswrapper[4761]: I1201 10:54:15.918202 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:15 crc kubenswrapper[4761]: I1201 10:54:15.918898 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:15 crc kubenswrapper[4761]: I1201 10:54:15.953209 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:15 crc kubenswrapper[4761]: I1201 10:54:15.965918 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:16 crc kubenswrapper[4761]: I1201 10:54:16.611380 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:16 crc kubenswrapper[4761]: I1201 10:54:16.611460 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 
10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.429582 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jf2tz" Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.429944 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jf2tz" Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.517155 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jf2tz" Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.537611 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.627498 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.688472 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.694840 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jf2tz" Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.793747 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jf2tz"] Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.822614 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5vnxn"] Dec 01 10:54:18 crc kubenswrapper[4761]: I1201 10:54:18.822891 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5vnxn" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="registry-server" containerID="cri-o://d318677f31f66e22c59e62bda306a046e2af3e5d3fb1833607b0887536ae339e" gracePeriod=2 Dec 01 10:54:20 crc 
kubenswrapper[4761]: I1201 10:54:20.645420 4761 generic.go:334] "Generic (PLEG): container finished" podID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerID="d318677f31f66e22c59e62bda306a046e2af3e5d3fb1833607b0887536ae339e" exitCode=0 Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.645492 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vnxn" event={"ID":"1396c61b-86e8-41e3-90d3-76c88b8c7994","Type":"ContainerDied","Data":"d318677f31f66e22c59e62bda306a046e2af3e5d3fb1833607b0887536ae339e"} Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.772510 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:54:20 crc kubenswrapper[4761]: E1201 10:54:20.773355 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerName="registry-server" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.773384 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerName="registry-server" Dec 01 10:54:20 crc kubenswrapper[4761]: E1201 10:54:20.773412 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerName="extract-content" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.773422 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerName="extract-content" Dec 01 10:54:20 crc kubenswrapper[4761]: E1201 10:54:20.773438 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerName="extract-utilities" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.773448 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerName="extract-utilities" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.773700 4761 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f3325c24-a894-4ab2-a5fc-b3d40597795e" containerName="registry-server" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.775022 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.795172 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.796740 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.803702 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.810714 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885715 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-nvme\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885768 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-scripts\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885789 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrxt\" (UniqueName: 
\"kubernetes.io/projected/33554e0e-766e-49ba-a811-83d941577557-kube-api-access-jtrxt\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885807 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-run\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885829 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-sys\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885854 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztwxj\" (UniqueName: \"kubernetes.io/projected/fc408da7-9bc6-4d4f-b288-279a86149a82-kube-api-access-ztwxj\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885870 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885885 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885900 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-lib-modules\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885918 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-config-data\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885941 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885958 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-run\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885978 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.885999 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-dev\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886022 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-scripts\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886041 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-config-data\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886054 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-sys\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886074 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-nvme\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886089 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886104 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886125 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-httpd-run\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886140 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-logs\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886159 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886175 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886190 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-logs\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886211 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-lib-modules\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886233 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-dev\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.886248 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-httpd-run\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.987892 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztwxj\" (UniqueName: \"kubernetes.io/projected/fc408da7-9bc6-4d4f-b288-279a86149a82-kube-api-access-ztwxj\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.987928 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.987944 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.987959 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-lib-modules\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.987978 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-config-data\") pod \"glance-default-single-1\" 
(UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988000 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988018 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-run\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988034 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988048 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-dev\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988067 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-scripts\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 
10:54:20.988083 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-config-data\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988097 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-sys\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988117 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-nvme\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988132 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988150 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988169 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-logs\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988184 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-httpd-run\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988204 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988220 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988235 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-logs\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988255 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-lib-modules\") pod \"glance-default-single-1\" 
(UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988277 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-dev\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988296 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-httpd-run\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988318 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-nvme\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988333 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-scripts\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988349 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrxt\" (UniqueName: \"kubernetes.io/projected/33554e0e-766e-49ba-a811-83d941577557-kube-api-access-jtrxt\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 
10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988364 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-run\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988380 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-sys\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988442 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-sys\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.988943 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.991668 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.991971 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.992810 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-dev\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.992857 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-lib-modules\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.993649 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-lib-modules\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.993666 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-run\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.993675 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-iscsi\") pod 
\"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.993742 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.993773 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-nvme\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.993909 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-dev\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.993981 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-sys\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.994119 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") device mount path \"/mnt/openstack/pv10\"" 
pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.994569 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-run\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.994604 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-logs\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:20 crc kubenswrapper[4761]: I1201 10:54:20.994678 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:20.995028 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-nvme\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:20.995087 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-logs\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:20.995093 4761 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:20.995402 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-httpd-run\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:20.996960 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-httpd-run\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.005333 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-config-data\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.011030 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-scripts\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.013667 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-scripts\") pod \"glance-default-single-2\" (UID: 
\"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.016094 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-config-data\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.018319 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrxt\" (UniqueName: \"kubernetes.io/projected/33554e0e-766e-49ba-a811-83d941577557-kube-api-access-jtrxt\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.031781 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.032991 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.037162 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztwxj\" (UniqueName: \"kubernetes.io/projected/fc408da7-9bc6-4d4f-b288-279a86149a82-kube-api-access-ztwxj\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc 
kubenswrapper[4761]: I1201 10:54:21.042281 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.057760 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-2\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.096384 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.126522 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.186780 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.190243 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-utilities\") pod \"1396c61b-86e8-41e3-90d3-76c88b8c7994\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.190305 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-catalog-content\") pod \"1396c61b-86e8-41e3-90d3-76c88b8c7994\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.190362 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7spmj\" (UniqueName: \"kubernetes.io/projected/1396c61b-86e8-41e3-90d3-76c88b8c7994-kube-api-access-7spmj\") pod \"1396c61b-86e8-41e3-90d3-76c88b8c7994\" (UID: \"1396c61b-86e8-41e3-90d3-76c88b8c7994\") " Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.197688 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-utilities" (OuterVolumeSpecName: "utilities") pod "1396c61b-86e8-41e3-90d3-76c88b8c7994" (UID: "1396c61b-86e8-41e3-90d3-76c88b8c7994"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.201209 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1396c61b-86e8-41e3-90d3-76c88b8c7994-kube-api-access-7spmj" (OuterVolumeSpecName: "kube-api-access-7spmj") pod "1396c61b-86e8-41e3-90d3-76c88b8c7994" (UID: "1396c61b-86e8-41e3-90d3-76c88b8c7994"). InnerVolumeSpecName "kube-api-access-7spmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.293167 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7spmj\" (UniqueName: \"kubernetes.io/projected/1396c61b-86e8-41e3-90d3-76c88b8c7994-kube-api-access-7spmj\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.293231 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.303744 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1396c61b-86e8-41e3-90d3-76c88b8c7994" (UID: "1396c61b-86e8-41e3-90d3-76c88b8c7994"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.395052 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1396c61b-86e8-41e3-90d3-76c88b8c7994-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.434919 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.653288 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"33554e0e-766e-49ba-a811-83d941577557","Type":"ContainerStarted","Data":"7fc8a0a400d7161ead1af25f1c511e3ec4bb77b5832ce79654441e33e8226bed"} Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.655930 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vnxn" 
event={"ID":"1396c61b-86e8-41e3-90d3-76c88b8c7994","Type":"ContainerDied","Data":"71209cbc57e363e89f17f72ddf7531c295813ee2eb643fdc0e3b34423a9ab277"} Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.655997 4761 scope.go:117] "RemoveContainer" containerID="d318677f31f66e22c59e62bda306a046e2af3e5d3fb1833607b0887536ae339e" Dec 01 10:54:21 crc kubenswrapper[4761]: I1201 10:54:21.656003 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vnxn" Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.261335 4761 scope.go:117] "RemoveContainer" containerID="ea75f85d815a0374a9115194268c0ee30a88a20fafcb63d33532ab2641d976e8" Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.300619 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.352641 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5vnxn"] Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.354938 4761 scope.go:117] "RemoveContainer" containerID="076f9017946141c281694a50fabe1c861187e3fb7ef656149f8c412d1f88dedb" Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.366008 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5vnxn"] Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.664918 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"33554e0e-766e-49ba-a811-83d941577557","Type":"ContainerStarted","Data":"dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880"} Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.665261 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"33554e0e-766e-49ba-a811-83d941577557","Type":"ContainerStarted","Data":"85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a"} Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.668624 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"fc408da7-9bc6-4d4f-b288-279a86149a82","Type":"ContainerStarted","Data":"da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b"} Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.668669 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"fc408da7-9bc6-4d4f-b288-279a86149a82","Type":"ContainerStarted","Data":"5c642155a422e855111aa99bf3f169e1c29c44f85f6da4eaefcf1f378c955ab0"} Dec 01 10:54:22 crc kubenswrapper[4761]: I1201 10:54:22.692092 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=3.692073417 podStartE2EDuration="3.692073417s" podCreationTimestamp="2025-12-01 10:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:54:22.690252649 +0000 UTC m=+1401.994011353" watchObservedRunningTime="2025-12-01 10:54:22.692073417 +0000 UTC m=+1401.995832051" Dec 01 10:54:23 crc kubenswrapper[4761]: I1201 10:54:23.138122 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" path="/var/lib/kubelet/pods/1396c61b-86e8-41e3-90d3-76c88b8c7994/volumes" Dec 01 10:54:23 crc kubenswrapper[4761]: I1201 10:54:23.690426 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"fc408da7-9bc6-4d4f-b288-279a86149a82","Type":"ContainerStarted","Data":"f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357"} Dec 01 10:54:23 crc kubenswrapper[4761]: I1201 10:54:23.722949 
4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-2" podStartSLOduration=4.722929424 podStartE2EDuration="4.722929424s" podCreationTimestamp="2025-12-01 10:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:54:23.711428019 +0000 UTC m=+1403.015186643" watchObservedRunningTime="2025-12-01 10:54:23.722929424 +0000 UTC m=+1403.026688048" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.097623 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.098433 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.127260 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.141080 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.161848 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.192032 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.194182 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.200296 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.775843 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.776117 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.776128 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:31 crc kubenswrapper[4761]: I1201 10:54:31.776136 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:33 crc kubenswrapper[4761]: I1201 10:54:33.705607 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:33 crc kubenswrapper[4761]: I1201 10:54:33.716143 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:33 crc kubenswrapper[4761]: I1201 10:54:33.726443 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:33 crc kubenswrapper[4761]: I1201 10:54:33.768994 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:33 crc kubenswrapper[4761]: I1201 10:54:33.852117 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:54:33 crc kubenswrapper[4761]: I1201 10:54:33.852165 4761 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:54:34 crc kubenswrapper[4761]: I1201 10:54:34.617113 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 01 10:54:34 crc kubenswrapper[4761]: I1201 10:54:34.626863 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:54:35 crc kubenswrapper[4761]: I1201 10:54:35.806955 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-log" containerID="cri-o://da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b" gracePeriod=30 Dec 01 10:54:35 crc kubenswrapper[4761]: I1201 10:54:35.807006 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-httpd" containerID="cri-o://f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357" gracePeriod=30 Dec 01 10:54:35 crc kubenswrapper[4761]: I1201 10:54:35.807132 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="33554e0e-766e-49ba-a811-83d941577557" containerName="glance-log" containerID="cri-o://85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a" gracePeriod=30 Dec 01 10:54:35 crc kubenswrapper[4761]: I1201 10:54:35.807214 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="33554e0e-766e-49ba-a811-83d941577557" containerName="glance-httpd" 
containerID="cri-o://dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880" gracePeriod=30 Dec 01 10:54:35 crc kubenswrapper[4761]: I1201 10:54:35.813281 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-2" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.141:9292/healthcheck\": EOF" Dec 01 10:54:36 crc kubenswrapper[4761]: I1201 10:54:36.815180 4761 generic.go:334] "Generic (PLEG): container finished" podID="33554e0e-766e-49ba-a811-83d941577557" containerID="85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a" exitCode=143 Dec 01 10:54:36 crc kubenswrapper[4761]: I1201 10:54:36.815301 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"33554e0e-766e-49ba-a811-83d941577557","Type":"ContainerDied","Data":"85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a"} Dec 01 10:54:36 crc kubenswrapper[4761]: I1201 10:54:36.817268 4761 generic.go:334] "Generic (PLEG): container finished" podID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerID="da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b" exitCode=143 Dec 01 10:54:36 crc kubenswrapper[4761]: I1201 10:54:36.817306 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"fc408da7-9bc6-4d4f-b288-279a86149a82","Type":"ContainerDied","Data":"da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b"} Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.445040 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.450583 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.554508 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-dev\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.554844 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-logs\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.554863 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-dev\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.554884 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztwxj\" (UniqueName: \"kubernetes.io/projected/fc408da7-9bc6-4d4f-b288-279a86149a82-kube-api-access-ztwxj\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.554909 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.554647 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-dev" (OuterVolumeSpecName: "dev") pod 
"fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.554936 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-run\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.554986 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-run" (OuterVolumeSpecName: "run") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555011 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-lib-modules\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555057 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-httpd-run\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555085 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-nvme\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 
10:54:39.555083 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-dev" (OuterVolumeSpecName: "dev") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555110 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555147 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-lib-modules\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555173 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtrxt\" (UniqueName: \"kubernetes.io/projected/33554e0e-766e-49ba-a811-83d941577557-kube-api-access-jtrxt\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555207 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-scripts\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555250 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-iscsi\") pod 
\"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555272 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-config-data\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555352 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-logs\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555376 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-run\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555398 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-iscsi\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555403 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555422 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-sys\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555442 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555468 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-nvme\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555502 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-scripts\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555527 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-logs" (OuterVolumeSpecName: "logs") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555619 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-run" (OuterVolumeSpecName: "run") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555645 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-sys" (OuterVolumeSpecName: "sys") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555616 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-sys\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555706 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-var-locks-brick\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555728 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555757 4761 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-config-data\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555781 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-httpd-run\") pod \"fc408da7-9bc6-4d4f-b288-279a86149a82\" (UID: \"fc408da7-9bc6-4d4f-b288-279a86149a82\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555812 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.555840 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-var-locks-brick\") pod \"33554e0e-766e-49ba-a811-83d941577557\" (UID: \"33554e0e-766e-49ba-a811-83d941577557\") " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556117 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-logs" (OuterVolumeSpecName: "logs") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556153 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556181 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556201 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-sys" (OuterVolumeSpecName: "sys") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556239 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556475 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556490 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556503 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556513 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556525 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556536 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556566 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556580 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556590 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556600 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33554e0e-766e-49ba-a811-83d941577557-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556611 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556621 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.556634 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.557581 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.557982 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.560637 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.560668 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.560675 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.562105 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-scripts" (OuterVolumeSpecName: "scripts") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.562202 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc408da7-9bc6-4d4f-b288-279a86149a82-kube-api-access-ztwxj" (OuterVolumeSpecName: "kube-api-access-ztwxj") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "kube-api-access-ztwxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.562457 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-scripts" (OuterVolumeSpecName: "scripts") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.562500 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.564066 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.565687 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.567184 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33554e0e-766e-49ba-a811-83d941577557-kube-api-access-jtrxt" (OuterVolumeSpecName: "kube-api-access-jtrxt") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "kube-api-access-jtrxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.573173 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.602807 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-config-data" (OuterVolumeSpecName: "config-data") pod "33554e0e-766e-49ba-a811-83d941577557" (UID: "33554e0e-766e-49ba-a811-83d941577557"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.605973 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-config-data" (OuterVolumeSpecName: "config-data") pod "fc408da7-9bc6-4d4f-b288-279a86149a82" (UID: "fc408da7-9bc6-4d4f-b288-279a86149a82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657747 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657819 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657834 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtrxt\" (UniqueName: \"kubernetes.io/projected/33554e0e-766e-49ba-a811-83d941577557-kube-api-access-jtrxt\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657849 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc 
kubenswrapper[4761]: I1201 10:54:39.657860 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657872 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc408da7-9bc6-4d4f-b288-279a86149a82-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657886 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/33554e0e-766e-49ba-a811-83d941577557-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657901 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657915 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fc408da7-9bc6-4d4f-b288-279a86149a82-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657941 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657956 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33554e0e-766e-49ba-a811-83d941577557-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657971 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fc408da7-9bc6-4d4f-b288-279a86149a82-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.657992 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.658008 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztwxj\" (UniqueName: \"kubernetes.io/projected/fc408da7-9bc6-4d4f-b288-279a86149a82-kube-api-access-ztwxj\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.658030 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.675787 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.677354 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.692855 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.695479 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.759036 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.759073 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.759085 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.759097 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.843828 4761 generic.go:334] "Generic (PLEG): container finished" podID="33554e0e-766e-49ba-a811-83d941577557" containerID="dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880" exitCode=0 Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.843914 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.843974 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"33554e0e-766e-49ba-a811-83d941577557","Type":"ContainerDied","Data":"dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880"} Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.844017 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"33554e0e-766e-49ba-a811-83d941577557","Type":"ContainerDied","Data":"7fc8a0a400d7161ead1af25f1c511e3ec4bb77b5832ce79654441e33e8226bed"} Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.844067 4761 scope.go:117] "RemoveContainer" containerID="dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.846769 4761 generic.go:334] "Generic (PLEG): container finished" podID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerID="f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357" exitCode=0 Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.846854 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.847161 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"fc408da7-9bc6-4d4f-b288-279a86149a82","Type":"ContainerDied","Data":"f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357"} Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.847364 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"fc408da7-9bc6-4d4f-b288-279a86149a82","Type":"ContainerDied","Data":"5c642155a422e855111aa99bf3f169e1c29c44f85f6da4eaefcf1f378c955ab0"} Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.869590 4761 scope.go:117] "RemoveContainer" containerID="85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.884426 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.898577 4761 scope.go:117] "RemoveContainer" containerID="dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880" Dec 01 10:54:39 crc kubenswrapper[4761]: E1201 10:54:39.899092 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880\": container with ID starting with dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880 not found: ID does not exist" containerID="dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.899144 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880"} err="failed to get container status 
\"dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880\": rpc error: code = NotFound desc = could not find container \"dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880\": container with ID starting with dc30d99dcef001babf91b446048a651bf013ae2cc3a6a6b2b771f350deec1880 not found: ID does not exist" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.899187 4761 scope.go:117] "RemoveContainer" containerID="85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a" Dec 01 10:54:39 crc kubenswrapper[4761]: E1201 10:54:39.899502 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a\": container with ID starting with 85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a not found: ID does not exist" containerID="85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.899563 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a"} err="failed to get container status \"85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a\": rpc error: code = NotFound desc = could not find container \"85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a\": container with ID starting with 85554cff8542352ead6956f4f3ba0d5a723b689f19b89e9fc9babe7733829a4a not found: ID does not exist" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.899589 4761 scope.go:117] "RemoveContainer" containerID="f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.903827 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.919841 4761 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.926939 4761 scope.go:117] "RemoveContainer" containerID="da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.931050 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.948180 4761 scope.go:117] "RemoveContainer" containerID="f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357" Dec 01 10:54:39 crc kubenswrapper[4761]: E1201 10:54:39.948633 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357\": container with ID starting with f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357 not found: ID does not exist" containerID="f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.948667 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357"} err="failed to get container status \"f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357\": rpc error: code = NotFound desc = could not find container \"f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357\": container with ID starting with f854d763099ced77d99ef86f8385e537a0076b35717d2b9ee92f87c08e5ad357 not found: ID does not exist" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.948693 4761 scope.go:117] "RemoveContainer" containerID="da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b" Dec 01 10:54:39 crc kubenswrapper[4761]: E1201 10:54:39.949047 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b\": container with ID starting with da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b not found: ID does not exist" containerID="da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b" Dec 01 10:54:39 crc kubenswrapper[4761]: I1201 10:54:39.949092 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b"} err="failed to get container status \"da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b\": rpc error: code = NotFound desc = could not find container \"da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b\": container with ID starting with da4bb04afc50de230c3f3021347bb9768c816515d9535af18a2202d44514472b not found: ID does not exist" Dec 01 10:54:41 crc kubenswrapper[4761]: I1201 10:54:41.000152 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:54:41 crc kubenswrapper[4761]: I1201 10:54:41.000530 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerName="glance-log" containerID="cri-o://d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1" gracePeriod=30 Dec 01 10:54:41 crc kubenswrapper[4761]: I1201 10:54:41.000650 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerName="glance-httpd" containerID="cri-o://6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94" gracePeriod=30 Dec 01 10:54:41 crc kubenswrapper[4761]: I1201 10:54:41.137727 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33554e0e-766e-49ba-a811-83d941577557" 
path="/var/lib/kubelet/pods/33554e0e-766e-49ba-a811-83d941577557/volumes" Dec 01 10:54:41 crc kubenswrapper[4761]: I1201 10:54:41.138687 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" path="/var/lib/kubelet/pods/fc408da7-9bc6-4d4f-b288-279a86149a82/volumes" Dec 01 10:54:41 crc kubenswrapper[4761]: I1201 10:54:41.876786 4761 generic.go:334] "Generic (PLEG): container finished" podID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerID="d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1" exitCode=143 Dec 01 10:54:41 crc kubenswrapper[4761]: I1201 10:54:41.876851 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9","Type":"ContainerDied","Data":"d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1"} Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.589730 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.636970 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.637681 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-sys\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.638048 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.638319 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-iscsi\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.638518 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-run\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.638765 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-httpd-run\") pod 
\"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.639059 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-dev\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.639246 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhpb5\" (UniqueName: \"kubernetes.io/projected/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-kube-api-access-xhpb5\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.639437 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-scripts\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.639284 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-dev" (OuterVolumeSpecName: "dev") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.639329 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-sys" (OuterVolumeSpecName: "sys") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.639835 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.640453 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-run" (OuterVolumeSpecName: "run") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.640822 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.641054 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-lib-modules\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.641236 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.641462 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-logs\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.641715 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-nvme\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.641901 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.642070 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-logs" (OuterVolumeSpecName: "logs") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.642662 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-config-data\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.642866 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-var-locks-brick\") pod \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\" (UID: \"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9\") " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.643019 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.643920 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.644091 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.644225 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.644353 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.644508 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.644714 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.644912 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.645102 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.645286 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.680873 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-kube-api-access-xhpb5" (OuterVolumeSpecName: "kube-api-access-xhpb5") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "kube-api-access-xhpb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.692463 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-scripts" (OuterVolumeSpecName: "scripts") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.692858 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.694776 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). 
InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.717650 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-config-data" (OuterVolumeSpecName: "config-data") pod "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" (UID: "77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.746429 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhpb5\" (UniqueName: \"kubernetes.io/projected/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-kube-api-access-xhpb5\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.746735 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.746820 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.746921 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.747010 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.765495 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage15-crc") on node "crc" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.765845 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.848378 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.848415 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.910809 4761 generic.go:334] "Generic (PLEG): container finished" podID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerID="6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94" exitCode=0 Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.910857 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9","Type":"ContainerDied","Data":"6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94"} Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.910888 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9","Type":"ContainerDied","Data":"f0ebbe5e65d5b9ff01ae86c34f35ea70a3e4568dbcf6e13b0c9b9fcc5f849938"} Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.910908 4761 scope.go:117] "RemoveContainer" containerID="6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.911042 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.962403 4761 scope.go:117] "RemoveContainer" containerID="d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.962536 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.977832 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.992099 4761 scope.go:117] "RemoveContainer" containerID="6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94" Dec 01 10:54:44 crc kubenswrapper[4761]: E1201 10:54:44.992962 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94\": container with ID starting with 6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94 not found: ID does not exist" containerID="6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.993015 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94"} err="failed to get container status \"6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94\": rpc error: code = NotFound desc = could not find container \"6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94\": container with ID starting with 6fa895aad408c7c724e016bbf6446a1056311152a5341c22b1ed0cf12eda2d94 not found: ID does not exist" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.993044 4761 scope.go:117] "RemoveContainer" 
containerID="d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1" Dec 01 10:54:44 crc kubenswrapper[4761]: E1201 10:54:44.993344 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1\": container with ID starting with d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1 not found: ID does not exist" containerID="d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1" Dec 01 10:54:44 crc kubenswrapper[4761]: I1201 10:54:44.993364 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1"} err="failed to get container status \"d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1\": rpc error: code = NotFound desc = could not find container \"d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1\": container with ID starting with d90e768d88b64b3568071f7fb13ca36f7a4ad625e3c6886cf2ba60f98fb82fb1 not found: ID does not exist" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.139291 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" path="/var/lib/kubelet/pods/77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9/volumes" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.378523 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-hjf46"] Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.384535 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-hjf46"] Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.413202 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance5333-account-delete-fn85r"] Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.414104 4761 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="33554e0e-766e-49ba-a811-83d941577557" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.414198 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="33554e0e-766e-49ba-a811-83d941577557" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.414276 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33554e0e-766e-49ba-a811-83d941577557" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.414354 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="33554e0e-766e-49ba-a811-83d941577557" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.414433 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.414504 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.414598 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.414669 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.414752 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="extract-utilities" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.414843 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="extract-utilities" Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.414927 4761 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="extract-content" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.414993 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="extract-content" Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.415056 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="registry-server" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.415107 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="registry-server" Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.415157 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.415215 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: E1201 10:54:45.415270 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.415344 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.415594 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.415683 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.415758 4761 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="77fde0e2-cc4a-4dd9-9da9-b2f8ffdce5c9" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.415844 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc408da7-9bc6-4d4f-b288-279a86149a82" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.415929 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="33554e0e-766e-49ba-a811-83d941577557" containerName="glance-httpd" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.416006 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1396c61b-86e8-41e3-90d3-76c88b8c7994" containerName="registry-server" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.416081 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="33554e0e-766e-49ba-a811-83d941577557" containerName="glance-log" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.416723 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.424044 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance5333-account-delete-fn85r"] Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.456155 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-operator-scripts\") pod \"glance5333-account-delete-fn85r\" (UID: \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\") " pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.456216 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftbd2\" (UniqueName: \"kubernetes.io/projected/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-kube-api-access-ftbd2\") pod 
\"glance5333-account-delete-fn85r\" (UID: \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\") " pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.557934 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-operator-scripts\") pod \"glance5333-account-delete-fn85r\" (UID: \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\") " pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.558505 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftbd2\" (UniqueName: \"kubernetes.io/projected/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-kube-api-access-ftbd2\") pod \"glance5333-account-delete-fn85r\" (UID: \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\") " pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.558748 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-operator-scripts\") pod \"glance5333-account-delete-fn85r\" (UID: \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\") " pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.578755 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftbd2\" (UniqueName: \"kubernetes.io/projected/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-kube-api-access-ftbd2\") pod \"glance5333-account-delete-fn85r\" (UID: \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\") " pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:45 crc kubenswrapper[4761]: I1201 10:54:45.741800 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:46 crc kubenswrapper[4761]: I1201 10:54:46.157359 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance5333-account-delete-fn85r"] Dec 01 10:54:46 crc kubenswrapper[4761]: W1201 10:54:46.164962 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c3f3872_dd47_4a0c_a8e6_88bbb861cc4b.slice/crio-03516502f0a128d47fe5dc08c56f88b668f1e10ac32dc76a43c13a8d917df814 WatchSource:0}: Error finding container 03516502f0a128d47fe5dc08c56f88b668f1e10ac32dc76a43c13a8d917df814: Status 404 returned error can't find the container with id 03516502f0a128d47fe5dc08c56f88b668f1e10ac32dc76a43c13a8d917df814 Dec 01 10:54:46 crc kubenswrapper[4761]: I1201 10:54:46.927360 4761 generic.go:334] "Generic (PLEG): container finished" podID="2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b" containerID="c07b9f8aa85340d67c1da884aeabeee7fe06be2221e8ccf267e75e75eb3832f9" exitCode=0 Dec 01 10:54:46 crc kubenswrapper[4761]: I1201 10:54:46.927565 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5333-account-delete-fn85r" event={"ID":"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b","Type":"ContainerDied","Data":"c07b9f8aa85340d67c1da884aeabeee7fe06be2221e8ccf267e75e75eb3832f9"} Dec 01 10:54:46 crc kubenswrapper[4761]: I1201 10:54:46.927664 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5333-account-delete-fn85r" event={"ID":"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b","Type":"ContainerStarted","Data":"03516502f0a128d47fe5dc08c56f88b668f1e10ac32dc76a43c13a8d917df814"} Dec 01 10:54:47 crc kubenswrapper[4761]: I1201 10:54:47.138293 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e8b1045-72b1-4dd5-a7b0-f58713505cb7" path="/var/lib/kubelet/pods/0e8b1045-72b1-4dd5-a7b0-f58713505cb7/volumes" Dec 01 10:54:48 crc kubenswrapper[4761]: 
I1201 10:54:48.246756 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.299494 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-operator-scripts\") pod \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\" (UID: \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\") " Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.299672 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftbd2\" (UniqueName: \"kubernetes.io/projected/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-kube-api-access-ftbd2\") pod \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\" (UID: \"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b\") " Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.300728 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b" (UID: "2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.307830 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-kube-api-access-ftbd2" (OuterVolumeSpecName: "kube-api-access-ftbd2") pod "2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b" (UID: "2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b"). InnerVolumeSpecName "kube-api-access-ftbd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.402041 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.402101 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftbd2\" (UniqueName: \"kubernetes.io/projected/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b-kube-api-access-ftbd2\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.962683 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5333-account-delete-fn85r" event={"ID":"2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b","Type":"ContainerDied","Data":"03516502f0a128d47fe5dc08c56f88b668f1e10ac32dc76a43c13a8d917df814"} Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.962729 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03516502f0a128d47fe5dc08c56f88b668f1e10ac32dc76a43c13a8d917df814" Dec 01 10:54:48 crc kubenswrapper[4761]: I1201 10:54:48.962772 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance5333-account-delete-fn85r" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.176034 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:54:49 crc kubenswrapper[4761]: E1201 10:54:49.176368 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b" containerName="mariadb-account-delete" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.176389 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b" containerName="mariadb-account-delete" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.176568 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b" containerName="mariadb-account-delete" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.177008 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.179112 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.181585 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.181704 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-4g9l2" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.182068 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.191800 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.213637 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.213700 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s59g\" (UniqueName: \"kubernetes.io/projected/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-kube-api-access-2s59g\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.213736 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.213887 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-scripts\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.315634 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-scripts\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.315738 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.315808 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s59g\" (UniqueName: \"kubernetes.io/projected/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-kube-api-access-2s59g\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.315883 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.316766 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-scripts\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.316823 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.321080 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.331967 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s59g\" (UniqueName: \"kubernetes.io/projected/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-kube-api-access-2s59g\") pod \"openstackclient\" (UID: \"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3\") " pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.494052 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Dec 01 10:54:49 crc kubenswrapper[4761]: I1201 10:54:49.954011 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Dec 01 10:54:49 crc kubenswrapper[4761]: W1201 10:54:49.969485 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ad227ac_66b6_4a9d_b5a8_adbf86fb8ba3.slice/crio-06d398a99c47362fba5039d7c807eb19e4da1c0ab06014d5f1ffe24aadf01f1e WatchSource:0}: Error finding container 06d398a99c47362fba5039d7c807eb19e4da1c0ab06014d5f1ffe24aadf01f1e: Status 404 returned error can't find the container with id 06d398a99c47362fba5039d7c807eb19e4da1c0ab06014d5f1ffe24aadf01f1e Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.453006 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-h2nvr"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.461744 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-h2nvr"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.472247 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-5333-account-create-update-xkdj2"] Dec 01 10:54:50 crc 
kubenswrapper[4761]: I1201 10:54:50.477791 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance5333-account-delete-fn85r"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.485918 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-5333-account-create-update-xkdj2"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.492769 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance5333-account-delete-fn85r"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.526519 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-jhwd8"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.527367 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.537613 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-jhwd8"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.631106 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.632190 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.634074 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-operator-scripts\") pod \"glance-db-create-jhwd8\" (UID: \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\") " pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.634199 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj78j\" (UniqueName: \"kubernetes.io/projected/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-kube-api-access-dj78j\") pod \"glance-db-create-jhwd8\" (UID: \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\") " pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.636988 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.641285 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g"] Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.735666 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72e3e267-9018-433d-95ec-0456c9fef8da-operator-scripts\") pod \"glance-a9d5-account-create-update-8hw8g\" (UID: \"72e3e267-9018-433d-95ec-0456c9fef8da\") " pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.736059 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj78j\" (UniqueName: \"kubernetes.io/projected/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-kube-api-access-dj78j\") pod 
\"glance-db-create-jhwd8\" (UID: \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\") " pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.736236 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-operator-scripts\") pod \"glance-db-create-jhwd8\" (UID: \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\") " pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.736334 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b2vh\" (UniqueName: \"kubernetes.io/projected/72e3e267-9018-433d-95ec-0456c9fef8da-kube-api-access-6b2vh\") pod \"glance-a9d5-account-create-update-8hw8g\" (UID: \"72e3e267-9018-433d-95ec-0456c9fef8da\") " pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.737483 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-operator-scripts\") pod \"glance-db-create-jhwd8\" (UID: \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\") " pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.755265 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj78j\" (UniqueName: \"kubernetes.io/projected/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-kube-api-access-dj78j\") pod \"glance-db-create-jhwd8\" (UID: \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\") " pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.839293 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b2vh\" (UniqueName: 
\"kubernetes.io/projected/72e3e267-9018-433d-95ec-0456c9fef8da-kube-api-access-6b2vh\") pod \"glance-a9d5-account-create-update-8hw8g\" (UID: \"72e3e267-9018-433d-95ec-0456c9fef8da\") " pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.839587 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72e3e267-9018-433d-95ec-0456c9fef8da-operator-scripts\") pod \"glance-a9d5-account-create-update-8hw8g\" (UID: \"72e3e267-9018-433d-95ec-0456c9fef8da\") " pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.840343 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72e3e267-9018-433d-95ec-0456c9fef8da-operator-scripts\") pod \"glance-a9d5-account-create-update-8hw8g\" (UID: \"72e3e267-9018-433d-95ec-0456c9fef8da\") " pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.853427 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.865245 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b2vh\" (UniqueName: \"kubernetes.io/projected/72e3e267-9018-433d-95ec-0456c9fef8da-kube-api-access-6b2vh\") pod \"glance-a9d5-account-create-update-8hw8g\" (UID: \"72e3e267-9018-433d-95ec-0456c9fef8da\") " pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:50 crc kubenswrapper[4761]: I1201 10:54:50.946759 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:51 crc kubenswrapper[4761]: I1201 10:54:51.008265 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3","Type":"ContainerStarted","Data":"65876a6b5c280d404cd4861aa4194cc669094e37ff9f37db6c30b6be70a11c03"} Dec 01 10:54:51 crc kubenswrapper[4761]: I1201 10:54:51.008318 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3","Type":"ContainerStarted","Data":"06d398a99c47362fba5039d7c807eb19e4da1c0ab06014d5f1ffe24aadf01f1e"} Dec 01 10:54:51 crc kubenswrapper[4761]: I1201 10:54:51.183312 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0db33355-a2b7-4291-85f9-2b5ad6f11d8b" path="/var/lib/kubelet/pods/0db33355-a2b7-4291-85f9-2b5ad6f11d8b/volumes" Dec 01 10:54:51 crc kubenswrapper[4761]: I1201 10:54:51.184213 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b" path="/var/lib/kubelet/pods/2c3f3872-dd47-4a0c-a8e6-88bbb861cc4b/volumes" Dec 01 10:54:51 crc kubenswrapper[4761]: I1201 10:54:51.185220 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6757cc2e-75a3-4e72-86a9-116cf82ee293" path="/var/lib/kubelet/pods/6757cc2e-75a3-4e72-86a9-116cf82ee293/volumes" Dec 01 10:54:51 crc kubenswrapper[4761]: I1201 10:54:51.235571 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.235537748 podStartE2EDuration="2.235537748s" podCreationTimestamp="2025-12-01 10:54:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:54:51.031592325 +0000 UTC m=+1430.335350949" watchObservedRunningTime="2025-12-01 10:54:51.235537748 
+0000 UTC m=+1430.539296362" Dec 01 10:54:51 crc kubenswrapper[4761]: I1201 10:54:51.237814 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g"] Dec 01 10:54:51 crc kubenswrapper[4761]: W1201 10:54:51.245301 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72e3e267_9018_433d_95ec_0456c9fef8da.slice/crio-a672400abdf283ca7c42b60fe0bc29c957793479370318f55eae64a12b0a718f WatchSource:0}: Error finding container a672400abdf283ca7c42b60fe0bc29c957793479370318f55eae64a12b0a718f: Status 404 returned error can't find the container with id a672400abdf283ca7c42b60fe0bc29c957793479370318f55eae64a12b0a718f Dec 01 10:54:51 crc kubenswrapper[4761]: I1201 10:54:51.367469 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-jhwd8"] Dec 01 10:54:51 crc kubenswrapper[4761]: W1201 10:54:51.376181 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd76b57f2_c1b1_43ab_bbb5_0c8411d0f8d0.slice/crio-8ed0748ed6701353a356411c1bf5987e736a10b23febcf8c5906e9745035cf2a WatchSource:0}: Error finding container 8ed0748ed6701353a356411c1bf5987e736a10b23febcf8c5906e9745035cf2a: Status 404 returned error can't find the container with id 8ed0748ed6701353a356411c1bf5987e736a10b23febcf8c5906e9745035cf2a Dec 01 10:54:51 crc kubenswrapper[4761]: E1201 10:54:51.885080 4761 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72e3e267_9018_433d_95ec_0456c9fef8da.slice/crio-85ae3cc8dd4564eb71c547ba412e90eebc1add08b5e29ff93fabb2de80f6d76a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72e3e267_9018_433d_95ec_0456c9fef8da.slice/crio-conmon-85ae3cc8dd4564eb71c547ba412e90eebc1add08b5e29ff93fabb2de80f6d76a.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:54:52 crc kubenswrapper[4761]: I1201 10:54:52.019026 4761 generic.go:334] "Generic (PLEG): container finished" podID="d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0" containerID="c92a66e140e324f99840149aef24d020cd60d5a14504733d37f2e0c3928548a9" exitCode=0 Dec 01 10:54:52 crc kubenswrapper[4761]: I1201 10:54:52.019139 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jhwd8" event={"ID":"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0","Type":"ContainerDied","Data":"c92a66e140e324f99840149aef24d020cd60d5a14504733d37f2e0c3928548a9"} Dec 01 10:54:52 crc kubenswrapper[4761]: I1201 10:54:52.019470 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jhwd8" event={"ID":"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0","Type":"ContainerStarted","Data":"8ed0748ed6701353a356411c1bf5987e736a10b23febcf8c5906e9745035cf2a"} Dec 01 10:54:52 crc kubenswrapper[4761]: I1201 10:54:52.021191 4761 generic.go:334] "Generic (PLEG): container finished" podID="72e3e267-9018-433d-95ec-0456c9fef8da" containerID="85ae3cc8dd4564eb71c547ba412e90eebc1add08b5e29ff93fabb2de80f6d76a" exitCode=0 Dec 01 10:54:52 crc kubenswrapper[4761]: I1201 10:54:52.021269 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" event={"ID":"72e3e267-9018-433d-95ec-0456c9fef8da","Type":"ContainerDied","Data":"85ae3cc8dd4564eb71c547ba412e90eebc1add08b5e29ff93fabb2de80f6d76a"} Dec 01 10:54:52 crc kubenswrapper[4761]: I1201 10:54:52.021311 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" 
event={"ID":"72e3e267-9018-433d-95ec-0456c9fef8da","Type":"ContainerStarted","Data":"a672400abdf283ca7c42b60fe0bc29c957793479370318f55eae64a12b0a718f"} Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.427044 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.431443 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.480874 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b2vh\" (UniqueName: \"kubernetes.io/projected/72e3e267-9018-433d-95ec-0456c9fef8da-kube-api-access-6b2vh\") pod \"72e3e267-9018-433d-95ec-0456c9fef8da\" (UID: \"72e3e267-9018-433d-95ec-0456c9fef8da\") " Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.481130 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-operator-scripts\") pod \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\" (UID: \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\") " Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.481250 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj78j\" (UniqueName: \"kubernetes.io/projected/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-kube-api-access-dj78j\") pod \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\" (UID: \"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0\") " Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.481298 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72e3e267-9018-433d-95ec-0456c9fef8da-operator-scripts\") pod \"72e3e267-9018-433d-95ec-0456c9fef8da\" (UID: 
\"72e3e267-9018-433d-95ec-0456c9fef8da\") " Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.481941 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0" (UID: "d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.482236 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72e3e267-9018-433d-95ec-0456c9fef8da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72e3e267-9018-433d-95ec-0456c9fef8da" (UID: "72e3e267-9018-433d-95ec-0456c9fef8da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.485863 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-kube-api-access-dj78j" (OuterVolumeSpecName: "kube-api-access-dj78j") pod "d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0" (UID: "d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0"). InnerVolumeSpecName "kube-api-access-dj78j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.486885 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72e3e267-9018-433d-95ec-0456c9fef8da-kube-api-access-6b2vh" (OuterVolumeSpecName: "kube-api-access-6b2vh") pod "72e3e267-9018-433d-95ec-0456c9fef8da" (UID: "72e3e267-9018-433d-95ec-0456c9fef8da"). InnerVolumeSpecName "kube-api-access-6b2vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.583164 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b2vh\" (UniqueName: \"kubernetes.io/projected/72e3e267-9018-433d-95ec-0456c9fef8da-kube-api-access-6b2vh\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.583201 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.583210 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj78j\" (UniqueName: \"kubernetes.io/projected/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0-kube-api-access-dj78j\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:53 crc kubenswrapper[4761]: I1201 10:54:53.583218 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72e3e267-9018-433d-95ec-0456c9fef8da-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:54:54 crc kubenswrapper[4761]: I1201 10:54:54.044272 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" Dec 01 10:54:54 crc kubenswrapper[4761]: I1201 10:54:54.044276 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g" event={"ID":"72e3e267-9018-433d-95ec-0456c9fef8da","Type":"ContainerDied","Data":"a672400abdf283ca7c42b60fe0bc29c957793479370318f55eae64a12b0a718f"} Dec 01 10:54:54 crc kubenswrapper[4761]: I1201 10:54:54.044395 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a672400abdf283ca7c42b60fe0bc29c957793479370318f55eae64a12b0a718f" Dec 01 10:54:54 crc kubenswrapper[4761]: I1201 10:54:54.050249 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jhwd8" event={"ID":"d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0","Type":"ContainerDied","Data":"8ed0748ed6701353a356411c1bf5987e736a10b23febcf8c5906e9745035cf2a"} Dec 01 10:54:54 crc kubenswrapper[4761]: I1201 10:54:54.050275 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ed0748ed6701353a356411c1bf5987e736a10b23febcf8c5906e9745035cf2a" Dec 01 10:54:54 crc kubenswrapper[4761]: I1201 10:54:54.050332 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jhwd8" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.776606 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-cjld5"] Dec 01 10:54:55 crc kubenswrapper[4761]: E1201 10:54:55.777162 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e3e267-9018-433d-95ec-0456c9fef8da" containerName="mariadb-account-create-update" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.777176 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e3e267-9018-433d-95ec-0456c9fef8da" containerName="mariadb-account-create-update" Dec 01 10:54:55 crc kubenswrapper[4761]: E1201 10:54:55.777192 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0" containerName="mariadb-database-create" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.777199 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0" containerName="mariadb-database-create" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.777371 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0" containerName="mariadb-database-create" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.777402 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="72e3e267-9018-433d-95ec-0456c9fef8da" containerName="mariadb-account-create-update" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.777933 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.780080 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-sjf2q" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.780179 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.794382 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-cjld5"] Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.821353 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-db-sync-config-data\") pod \"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.821435 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9v2g\" (UniqueName: \"kubernetes.io/projected/43f7736e-ab16-4070-a95c-e13f2681b346-kube-api-access-s9v2g\") pod \"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.821490 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-config-data\") pod \"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.923244 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9v2g\" (UniqueName: 
\"kubernetes.io/projected/43f7736e-ab16-4070-a95c-e13f2681b346-kube-api-access-s9v2g\") pod \"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.923312 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-config-data\") pod \"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.923395 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-db-sync-config-data\") pod \"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.937096 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-db-sync-config-data\") pod \"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.937276 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-config-data\") pod \"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:55 crc kubenswrapper[4761]: I1201 10:54:55.950907 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9v2g\" (UniqueName: \"kubernetes.io/projected/43f7736e-ab16-4070-a95c-e13f2681b346-kube-api-access-s9v2g\") pod 
\"glance-db-sync-cjld5\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:56 crc kubenswrapper[4761]: I1201 10:54:56.099605 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:54:56 crc kubenswrapper[4761]: I1201 10:54:56.414140 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-cjld5"] Dec 01 10:54:56 crc kubenswrapper[4761]: W1201 10:54:56.423335 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43f7736e_ab16_4070_a95c_e13f2681b346.slice/crio-6b0b07f550cf0d5e8e02cf8d130bc691b1baa98c0cc73d381bda81ef86310a22 WatchSource:0}: Error finding container 6b0b07f550cf0d5e8e02cf8d130bc691b1baa98c0cc73d381bda81ef86310a22: Status 404 returned error can't find the container with id 6b0b07f550cf0d5e8e02cf8d130bc691b1baa98c0cc73d381bda81ef86310a22 Dec 01 10:54:57 crc kubenswrapper[4761]: I1201 10:54:57.106308 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-cjld5" event={"ID":"43f7736e-ab16-4070-a95c-e13f2681b346","Type":"ContainerStarted","Data":"4bf439d53e72d7d5552ffba6b5ff6f5739d05ee4e3cc58da728fd6f38dc59a69"} Dec 01 10:54:57 crc kubenswrapper[4761]: I1201 10:54:57.106672 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-cjld5" event={"ID":"43f7736e-ab16-4070-a95c-e13f2681b346","Type":"ContainerStarted","Data":"6b0b07f550cf0d5e8e02cf8d130bc691b1baa98c0cc73d381bda81ef86310a22"} Dec 01 10:54:57 crc kubenswrapper[4761]: I1201 10:54:57.132976 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-cjld5" podStartSLOduration=2.132962186 podStartE2EDuration="2.132962186s" podCreationTimestamp="2025-12-01 10:54:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:54:57.126513754 +0000 UTC m=+1436.430272398" watchObservedRunningTime="2025-12-01 10:54:57.132962186 +0000 UTC m=+1436.436720810" Dec 01 10:55:00 crc kubenswrapper[4761]: I1201 10:55:00.142458 4761 generic.go:334] "Generic (PLEG): container finished" podID="43f7736e-ab16-4070-a95c-e13f2681b346" containerID="4bf439d53e72d7d5552ffba6b5ff6f5739d05ee4e3cc58da728fd6f38dc59a69" exitCode=0 Dec 01 10:55:00 crc kubenswrapper[4761]: I1201 10:55:00.142610 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-cjld5" event={"ID":"43f7736e-ab16-4070-a95c-e13f2681b346","Type":"ContainerDied","Data":"4bf439d53e72d7d5552ffba6b5ff6f5739d05ee4e3cc58da728fd6f38dc59a69"} Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.457870 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.546631 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9v2g\" (UniqueName: \"kubernetes.io/projected/43f7736e-ab16-4070-a95c-e13f2681b346-kube-api-access-s9v2g\") pod \"43f7736e-ab16-4070-a95c-e13f2681b346\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.547079 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-db-sync-config-data\") pod \"43f7736e-ab16-4070-a95c-e13f2681b346\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.547183 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-config-data\") pod 
\"43f7736e-ab16-4070-a95c-e13f2681b346\" (UID: \"43f7736e-ab16-4070-a95c-e13f2681b346\") " Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.555943 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f7736e-ab16-4070-a95c-e13f2681b346-kube-api-access-s9v2g" (OuterVolumeSpecName: "kube-api-access-s9v2g") pod "43f7736e-ab16-4070-a95c-e13f2681b346" (UID: "43f7736e-ab16-4070-a95c-e13f2681b346"). InnerVolumeSpecName "kube-api-access-s9v2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.555964 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "43f7736e-ab16-4070-a95c-e13f2681b346" (UID: "43f7736e-ab16-4070-a95c-e13f2681b346"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.600687 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-config-data" (OuterVolumeSpecName: "config-data") pod "43f7736e-ab16-4070-a95c-e13f2681b346" (UID: "43f7736e-ab16-4070-a95c-e13f2681b346"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.649375 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.649404 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9v2g\" (UniqueName: \"kubernetes.io/projected/43f7736e-ab16-4070-a95c-e13f2681b346-kube-api-access-s9v2g\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:01 crc kubenswrapper[4761]: I1201 10:55:01.649414 4761 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43f7736e-ab16-4070-a95c-e13f2681b346-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:02 crc kubenswrapper[4761]: I1201 10:55:02.162373 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-cjld5" event={"ID":"43f7736e-ab16-4070-a95c-e13f2681b346","Type":"ContainerDied","Data":"6b0b07f550cf0d5e8e02cf8d130bc691b1baa98c0cc73d381bda81ef86310a22"} Dec 01 10:55:02 crc kubenswrapper[4761]: I1201 10:55:02.162417 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b0b07f550cf0d5e8e02cf8d130bc691b1baa98c0cc73d381bda81ef86310a22" Dec 01 10:55:02 crc kubenswrapper[4761]: I1201 10:55:02.162446 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-cjld5" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.398762 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:55:03 crc kubenswrapper[4761]: E1201 10:55:03.399598 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f7736e-ab16-4070-a95c-e13f2681b346" containerName="glance-db-sync" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.399630 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f7736e-ab16-4070-a95c-e13f2681b346" containerName="glance-db-sync" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.399903 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f7736e-ab16-4070-a95c-e13f2681b346" containerName="glance-db-sync" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.401312 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.405706 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.405884 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.413537 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.417593 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-sjf2q" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505394 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-run\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505454 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-logs\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505486 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xcj9\" (UniqueName: \"kubernetes.io/projected/87789d6a-0bcb-4bac-86a4-97a57045c3bc-kube-api-access-5xcj9\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505507 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505531 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505603 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-dev\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505635 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505680 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-config-data\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505703 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505723 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-sys\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505742 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505766 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-scripts\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505803 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.505842 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.606114 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.606997 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-var-locks-brick\") pod 
\"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607066 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607107 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-run\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607151 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-logs\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607170 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607189 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xcj9\" (UniqueName: \"kubernetes.io/projected/87789d6a-0bcb-4bac-86a4-97a57045c3bc-kube-api-access-5xcj9\") pod \"glance-default-external-api-1\" (UID: 
\"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607255 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607285 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607415 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-dev\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607480 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607509 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607522 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-run\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607564 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607610 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607617 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-config-data\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607639 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-dev\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 
10:55:03.607653 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607679 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607699 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-sys\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607729 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-scripts\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607915 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.607932 4761 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.608016 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-logs\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.608083 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.608279 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.608361 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-sys\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.610512 4761 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.616301 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-scripts\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.616383 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-config-data\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.631167 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xcj9\" (UniqueName: \"kubernetes.io/projected/87789d6a-0bcb-4bac-86a4-97a57045c3bc-kube-api-access-5xcj9\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.634153 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.641651 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.656811 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.668043 4761 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.669718 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.681685 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708469 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708524 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-dev\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708594 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708613 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708630 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-run\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708665 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msgp4\" (UniqueName: \"kubernetes.io/projected/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-kube-api-access-msgp4\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708682 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708708 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708752 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708775 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708848 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708866 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708891 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.708926 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-sys\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.711331 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.712489 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.718829 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.722420 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.809920 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810243 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-sys\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810266 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-config-data\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810292 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810308 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810321 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-run\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810320 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-sys\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810336 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-logs\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810354 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810394 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810502 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-dev\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810570 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmgdq\" (UniqueName: \"kubernetes.io/projected/2cd1ced1-a073-414c-832a-1d2c443342ae-kube-api-access-kmgdq\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810609 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-dev\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810630 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgcgx\" (UniqueName: \"kubernetes.io/projected/d57f30e9-7c56-4ac4-9124-089ea591304f-kube-api-access-vgcgx\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810648 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810664 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-logs\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810667 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-dev\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810623 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810679 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-sys\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810740 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-dev\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810755 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810776 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810796 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810811 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810829 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810847 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810863 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810882 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-run\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810898 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-sys\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810914 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msgp4\" (UniqueName: \"kubernetes.io/projected/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-kube-api-access-msgp4\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810929 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810944 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-config-data\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810956 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-scripts\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810969 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.810995 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811012 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-scripts\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811030 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811045 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811070 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811100 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811113 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-run\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811132 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811149 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811169 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811187 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811242 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.811685 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.812024 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-run\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.812301 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.812373 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.812399 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.812402 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.812652 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.830619 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.831020 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.837225 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msgp4\" (UniqueName: \"kubernetes.io/projected/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-kube-api-access-msgp4\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.854133 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.857805 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.857882 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.858206 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.912820 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.912865 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-run\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.912919 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-run\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.912975 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.912997 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913018 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-config-data\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913051 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913076 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc 
kubenswrapper[4761]: I1201 10:55:03.913096 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-run\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913116 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913136 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-logs\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913160 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-dev\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913181 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmgdq\" (UniqueName: \"kubernetes.io/projected/2cd1ced1-a073-414c-832a-1d2c443342ae-kube-api-access-kmgdq\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913217 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vgcgx\" (UniqueName: \"kubernetes.io/projected/d57f30e9-7c56-4ac4-9124-089ea591304f-kube-api-access-vgcgx\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913241 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-logs\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913259 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913275 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-sys\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913294 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-dev\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913311 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913332 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913352 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913368 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913384 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913408 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-sys\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913427 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-config-data\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913440 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-scripts\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913454 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913472 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.913490 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-scripts\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914444 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914444 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914512 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914527 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914591 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-dev\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914655 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-dev\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914726 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914748 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914752 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-logs\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914779 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-nvme\") pod 
\"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914793 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-run\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.914819 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915085 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915125 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915279 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: 
\"2cd1ced1-a073-414c-832a-1d2c443342ae\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915333 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915465 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-sys\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915521 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-sys\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915538 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915751 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.915877 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-logs\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.917323 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-scripts\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.921966 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-scripts\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.922191 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-config-data\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.941052 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-config-data\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 
10:55:03.943072 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgcgx\" (UniqueName: \"kubernetes.io/projected/d57f30e9-7c56-4ac4-9124-089ea591304f-kube-api-access-vgcgx\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.943534 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.943661 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmgdq\" (UniqueName: \"kubernetes.io/projected/2cd1ced1-a073-414c-832a-1d2c443342ae-kube-api-access-kmgdq\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.945597 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.971057 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.995819 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:03 crc kubenswrapper[4761]: I1201 10:55:03.997776 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.003305 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.006814 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.043910 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.183182 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"87789d6a-0bcb-4bac-86a4-97a57045c3bc","Type":"ContainerStarted","Data":"7799925df1befd027ab1bfc5bc44b250e99df2751f4cfb1440f3f8b226e5f020"} Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.183443 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"87789d6a-0bcb-4bac-86a4-97a57045c3bc","Type":"ContainerStarted","Data":"d10fba358e6249575b7a178f1abd2fb06036fcc934627b602f368dd0d6c36380"} Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.498662 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.549327 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.579249 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:04 crc kubenswrapper[4761]: W1201 10:55:04.593374 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb16973a7_ae65_40d0_b3aa_1341cfb4df0e.slice/crio-9b41ef7ea9a21fe797b7aed4bc2b5613f969ebfe943866c3c397d5c334d27be8 WatchSource:0}: Error finding container 9b41ef7ea9a21fe797b7aed4bc2b5613f969ebfe943866c3c397d5c334d27be8: Status 404 returned error can't find the container with id 9b41ef7ea9a21fe797b7aed4bc2b5613f969ebfe943866c3c397d5c334d27be8 Dec 01 10:55:04 crc kubenswrapper[4761]: I1201 10:55:04.653514 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.204988 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d57f30e9-7c56-4ac4-9124-089ea591304f","Type":"ContainerStarted","Data":"21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.205628 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d57f30e9-7c56-4ac4-9124-089ea591304f","Type":"ContainerStarted","Data":"fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.205648 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d57f30e9-7c56-4ac4-9124-089ea591304f","Type":"ContainerStarted","Data":"c0c8ecdc134a16c7ca47ee9d013c8a7f98405197f3abf4550db592a600de9948"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.217246 4761 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerName="glance-log" containerID="cri-o://20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3" gracePeriod=30 Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.217597 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2cd1ced1-a073-414c-832a-1d2c443342ae","Type":"ContainerStarted","Data":"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.217627 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2cd1ced1-a073-414c-832a-1d2c443342ae","Type":"ContainerStarted","Data":"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.217641 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2cd1ced1-a073-414c-832a-1d2c443342ae","Type":"ContainerStarted","Data":"8bcf60c0c7d959fdd0edae698e8079cd2adf45c824a747d05d7178354514c5bb"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.217698 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerName="glance-httpd" containerID="cri-o://17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585" gracePeriod=30 Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.225470 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b16973a7-ae65-40d0-b3aa-1341cfb4df0e","Type":"ContainerStarted","Data":"d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b"} Dec 01 10:55:05 crc 
kubenswrapper[4761]: I1201 10:55:05.225507 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b16973a7-ae65-40d0-b3aa-1341cfb4df0e","Type":"ContainerStarted","Data":"e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.225523 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b16973a7-ae65-40d0-b3aa-1341cfb4df0e","Type":"ContainerStarted","Data":"9b41ef7ea9a21fe797b7aed4bc2b5613f969ebfe943866c3c397d5c334d27be8"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.230142 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"87789d6a-0bcb-4bac-86a4-97a57045c3bc","Type":"ContainerStarted","Data":"8f22fa8ce96c1829b6f6195e15938371b332506279c65c048124e901c586dccd"} Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.261299 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.261281949 podStartE2EDuration="3.261281949s" podCreationTimestamp="2025-12-01 10:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:55:05.241490684 +0000 UTC m=+1444.545249308" watchObservedRunningTime="2025-12-01 10:55:05.261281949 +0000 UTC m=+1444.565040573" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.262052 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.262044069 podStartE2EDuration="3.262044069s" podCreationTimestamp="2025-12-01 10:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
10:55:05.258815843 +0000 UTC m=+1444.562574467" watchObservedRunningTime="2025-12-01 10:55:05.262044069 +0000 UTC m=+1444.565802693" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.294362 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=2.294344276 podStartE2EDuration="2.294344276s" podCreationTimestamp="2025-12-01 10:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:55:05.289373744 +0000 UTC m=+1444.593132368" watchObservedRunningTime="2025-12-01 10:55:05.294344276 +0000 UTC m=+1444.598102900" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.319438 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.319420132 podStartE2EDuration="3.319420132s" podCreationTimestamp="2025-12-01 10:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:55:05.310867145 +0000 UTC m=+1444.614625759" watchObservedRunningTime="2025-12-01 10:55:05.319420132 +0000 UTC m=+1444.623178756" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.532961 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.639808 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-config-data\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640137 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-lib-modules\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640190 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmgdq\" (UniqueName: \"kubernetes.io/projected/2cd1ced1-a073-414c-832a-1d2c443342ae-kube-api-access-kmgdq\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640229 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-dev\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640254 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-nvme\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640288 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-run\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640312 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640337 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-sys\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640350 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-iscsi\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640371 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-var-locks-brick\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640414 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-scripts\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640449 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640479 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-logs\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.640512 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-httpd-run\") pod \"2cd1ced1-a073-414c-832a-1d2c443342ae\" (UID: \"2cd1ced1-a073-414c-832a-1d2c443342ae\") " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.641004 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.643443 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-dev" (OuterVolumeSpecName: "dev") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.643517 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.643539 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.643575 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-sys" (OuterVolumeSpecName: "sys") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.643595 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.645402 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.645458 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-run" (OuterVolumeSpecName: "run") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.645457 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-logs" (OuterVolumeSpecName: "logs") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.647751 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd1ced1-a073-414c-832a-1d2c443342ae-kube-api-access-kmgdq" (OuterVolumeSpecName: "kube-api-access-kmgdq") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "kube-api-access-kmgdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.647860 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.648140 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.650648 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-scripts" (OuterVolumeSpecName: "scripts") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.688963 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-config-data" (OuterVolumeSpecName: "config-data") pod "2cd1ced1-a073-414c-832a-1d2c443342ae" (UID: "2cd1ced1-a073-414c-832a-1d2c443342ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.744624 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.744856 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.744984 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745085 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745169 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd1ced1-a073-414c-832a-1d2c443342ae-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745247 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd1ced1-a073-414c-832a-1d2c443342ae-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745324 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745432 4761 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kmgdq\" (UniqueName: \"kubernetes.io/projected/2cd1ced1-a073-414c-832a-1d2c443342ae-kube-api-access-kmgdq\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745560 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745646 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745724 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.745818 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.749015 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.749091 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2cd1ced1-a073-414c-832a-1d2c443342ae-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.759730 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.761772 4761 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.850593 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:05 crc kubenswrapper[4761]: I1201 10:55:05.850645 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.255700 4761 generic.go:334] "Generic (PLEG): container finished" podID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerID="17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585" exitCode=143 Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.257682 4761 generic.go:334] "Generic (PLEG): container finished" podID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerID="20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3" exitCode=143 Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.256230 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.256107 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2cd1ced1-a073-414c-832a-1d2c443342ae","Type":"ContainerDied","Data":"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585"} Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.273388 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2cd1ced1-a073-414c-832a-1d2c443342ae","Type":"ContainerDied","Data":"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3"} Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.273490 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"2cd1ced1-a073-414c-832a-1d2c443342ae","Type":"ContainerDied","Data":"8bcf60c0c7d959fdd0edae698e8079cd2adf45c824a747d05d7178354514c5bb"} Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.273589 4761 scope.go:117] "RemoveContainer" containerID="17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.313622 4761 scope.go:117] "RemoveContainer" containerID="20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.324667 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.331503 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.351813 4761 scope.go:117] "RemoveContainer" containerID="17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.353628 4761 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:06 crc kubenswrapper[4761]: E1201 10:55:06.354013 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerName="glance-log" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.354027 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerName="glance-log" Dec 01 10:55:06 crc kubenswrapper[4761]: E1201 10:55:06.354074 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerName="glance-httpd" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.354082 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerName="glance-httpd" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.354241 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerName="glance-httpd" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.354258 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" containerName="glance-log" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.355208 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: E1201 10:55:06.358047 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585\": container with ID starting with 17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585 not found: ID does not exist" containerID="17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.358091 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585"} err="failed to get container status \"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585\": rpc error: code = NotFound desc = could not find container \"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585\": container with ID starting with 17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585 not found: ID does not exist" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.358115 4761 scope.go:117] "RemoveContainer" containerID="20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3" Dec 01 10:55:06 crc kubenswrapper[4761]: E1201 10:55:06.358651 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3\": container with ID starting with 20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3 not found: ID does not exist" containerID="20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.358677 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3"} err="failed to get container status \"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3\": rpc error: code = NotFound desc = could not find container \"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3\": container with ID starting with 20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3 not found: ID does not exist" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.358692 4761 scope.go:117] "RemoveContainer" containerID="17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.358890 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585"} err="failed to get container status \"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585\": rpc error: code = NotFound desc = could not find container \"17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585\": container with ID starting with 17a926d5e9f83a62ca83053bc6c0802e08f170eb936900b27de930e4839bc585 not found: ID does not exist" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.358934 4761 scope.go:117] "RemoveContainer" containerID="20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.359103 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3"} err="failed to get container status \"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3\": rpc error: code = NotFound desc = could not find container \"20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3\": container with ID starting with 20de3a8f4593be57975a3632ece3f9e2798ef52076f513532776b384b0fd18c3 not found: ID does not 
exist" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.360660 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465000 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465076 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465108 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-run\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465150 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cp8t\" (UniqueName: \"kubernetes.io/projected/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-kube-api-access-4cp8t\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465193 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465228 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-dev\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465249 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465272 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-sys\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465298 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465319 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465352 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465372 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465398 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.465423 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-logs\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566580 4761 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566639 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-dev\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566659 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566676 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-sys\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566700 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566718 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566746 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566762 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566784 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566808 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-logs\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566831 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566835 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-sys\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566858 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566888 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-run\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566913 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-dev\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566917 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cp8t\" (UniqueName: \"kubernetes.io/projected/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-kube-api-access-4cp8t\") 
pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.567355 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.567524 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.567642 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.567744 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.568096 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-logs\") pod 
\"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.568188 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.566886 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.571328 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.571328 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-run\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.573798 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-config-data\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.578496 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-scripts\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.587442 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cp8t\" (UniqueName: \"kubernetes.io/projected/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-kube-api-access-4cp8t\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.599940 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.610581 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:06 crc kubenswrapper[4761]: I1201 10:55:06.674044 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:07 crc kubenswrapper[4761]: I1201 10:55:07.113531 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:55:07 crc kubenswrapper[4761]: W1201 10:55:07.119962 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eeeff96_0cd3_4ab7_bd66_6890fd79076d.slice/crio-157d3bd1f089d1636ccbee34f7f1f420bcee364d62f92d5bd23f0a94d142862a WatchSource:0}: Error finding container 157d3bd1f089d1636ccbee34f7f1f420bcee364d62f92d5bd23f0a94d142862a: Status 404 returned error can't find the container with id 157d3bd1f089d1636ccbee34f7f1f420bcee364d62f92d5bd23f0a94d142862a Dec 01 10:55:07 crc kubenswrapper[4761]: I1201 10:55:07.140527 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd1ced1-a073-414c-832a-1d2c443342ae" path="/var/lib/kubelet/pods/2cd1ced1-a073-414c-832a-1d2c443342ae/volumes" Dec 01 10:55:07 crc kubenswrapper[4761]: I1201 10:55:07.268737 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"0eeeff96-0cd3-4ab7-bd66-6890fd79076d","Type":"ContainerStarted","Data":"157d3bd1f089d1636ccbee34f7f1f420bcee364d62f92d5bd23f0a94d142862a"} Dec 01 10:55:08 crc kubenswrapper[4761]: I1201 10:55:08.279441 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"0eeeff96-0cd3-4ab7-bd66-6890fd79076d","Type":"ContainerStarted","Data":"db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4"} Dec 01 10:55:08 crc kubenswrapper[4761]: I1201 10:55:08.280049 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"0eeeff96-0cd3-4ab7-bd66-6890fd79076d","Type":"ContainerStarted","Data":"fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6"} Dec 01 10:55:08 crc kubenswrapper[4761]: I1201 10:55:08.313808 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=2.313773597 podStartE2EDuration="2.313773597s" podCreationTimestamp="2025-12-01 10:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:55:08.306130674 +0000 UTC m=+1447.609889378" watchObservedRunningTime="2025-12-01 10:55:08.313773597 +0000 UTC m=+1447.617532271" Dec 01 10:55:13 crc kubenswrapper[4761]: I1201 10:55:13.723673 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:13 crc kubenswrapper[4761]: I1201 10:55:13.724301 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:13 crc kubenswrapper[4761]: I1201 10:55:13.775884 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:13 crc kubenswrapper[4761]: I1201 10:55:13.810156 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:13 crc kubenswrapper[4761]: I1201 10:55:13.999289 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:13 crc kubenswrapper[4761]: I1201 10:55:13.999354 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.039624 4761 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.044447 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.044504 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.070925 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.089133 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.134056 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.355234 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.355314 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.355347 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.356426 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.356576 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 
10:55:14 crc kubenswrapper[4761]: I1201 10:55:14.356672 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.258011 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.258584 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.273454 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.280194 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.344160 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.368274 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.368302 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.446272 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.448022 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.675316 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:16 
crc kubenswrapper[4761]: I1201 10:55:16.677078 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.709237 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:16 crc kubenswrapper[4761]: I1201 10:55:16.722278 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:17 crc kubenswrapper[4761]: I1201 10:55:17.388339 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-log" containerID="cri-o://fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d" gracePeriod=30 Dec 01 10:55:17 crc kubenswrapper[4761]: I1201 10:55:17.388439 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-httpd" containerID="cri-o://21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1" gracePeriod=30 Dec 01 10:55:17 crc kubenswrapper[4761]: I1201 10:55:17.388654 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:17 crc kubenswrapper[4761]: I1201 10:55:17.389114 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:17 crc kubenswrapper[4761]: I1201 10:55:17.406049 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.151:9292/healthcheck\": 
EOF" Dec 01 10:55:18 crc kubenswrapper[4761]: I1201 10:55:18.402229 4761 generic.go:334] "Generic (PLEG): container finished" podID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerID="fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d" exitCode=143 Dec 01 10:55:18 crc kubenswrapper[4761]: I1201 10:55:18.402339 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d57f30e9-7c56-4ac4-9124-089ea591304f","Type":"ContainerDied","Data":"fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d"} Dec 01 10:55:19 crc kubenswrapper[4761]: I1201 10:55:19.412998 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:55:19 crc kubenswrapper[4761]: I1201 10:55:19.413041 4761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:55:19 crc kubenswrapper[4761]: I1201 10:55:19.518664 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:19 crc kubenswrapper[4761]: I1201 10:55:19.520812 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:55:19 crc kubenswrapper[4761]: I1201 10:55:19.587231 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:19 crc kubenswrapper[4761]: I1201 10:55:19.587772 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerName="glance-log" containerID="cri-o://e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf" gracePeriod=30 Dec 01 10:55:19 crc kubenswrapper[4761]: I1201 10:55:19.588095 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" 
podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerName="glance-httpd" containerID="cri-o://d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b" gracePeriod=30 Dec 01 10:55:20 crc kubenswrapper[4761]: I1201 10:55:20.425756 4761 generic.go:334] "Generic (PLEG): container finished" podID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerID="e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf" exitCode=143 Dec 01 10:55:20 crc kubenswrapper[4761]: I1201 10:55:20.425902 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b16973a7-ae65-40d0-b3aa-1341cfb4df0e","Type":"ContainerDied","Data":"e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf"} Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.116013 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172159 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-sys\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172200 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-var-locks-brick\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172227 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc 
kubenswrapper[4761]: I1201 10:55:22.172246 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-dev\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172277 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172309 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-config-data\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172325 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172349 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-httpd-run\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172374 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-nvme\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172347 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-dev" (OuterVolumeSpecName: "dev") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172427 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-lib-modules\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172470 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172503 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172535 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-iscsi\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172601 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgcgx\" (UniqueName: \"kubernetes.io/projected/d57f30e9-7c56-4ac4-9124-089ea591304f-kube-api-access-vgcgx\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172642 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-logs\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172683 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-run\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172714 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-scripts\") pod \"d57f30e9-7c56-4ac4-9124-089ea591304f\" (UID: \"d57f30e9-7c56-4ac4-9124-089ea591304f\") " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172710 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173097 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-sys" (OuterVolumeSpecName: "sys") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.172983 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173010 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-logs" (OuterVolumeSpecName: "logs") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173043 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-run" (OuterVolumeSpecName: "run") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173392 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173412 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173424 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173435 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173445 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173457 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173469 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173482 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d57f30e9-7c56-4ac4-9124-089ea591304f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.173492 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d57f30e9-7c56-4ac4-9124-089ea591304f-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.179322 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57f30e9-7c56-4ac4-9124-089ea591304f-kube-api-access-vgcgx" (OuterVolumeSpecName: "kube-api-access-vgcgx") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "kube-api-access-vgcgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.180232 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.180562 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-scripts" (OuterVolumeSpecName: "scripts") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.192724 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). 
InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.228776 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-config-data" (OuterVolumeSpecName: "config-data") pod "d57f30e9-7c56-4ac4-9124-089ea591304f" (UID: "d57f30e9-7c56-4ac4-9124-089ea591304f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.274527 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgcgx\" (UniqueName: \"kubernetes.io/projected/d57f30e9-7c56-4ac4-9124-089ea591304f-kube-api-access-vgcgx\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.274588 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.274625 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.274641 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57f30e9-7c56-4ac4-9124-089ea591304f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.274659 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.287792 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.297993 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.375851 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.375899 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.444582 4761 generic.go:334] "Generic (PLEG): container finished" podID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerID="21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1" exitCode=0 Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.444646 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d57f30e9-7c56-4ac4-9124-089ea591304f","Type":"ContainerDied","Data":"21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1"} Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.444689 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d57f30e9-7c56-4ac4-9124-089ea591304f","Type":"ContainerDied","Data":"c0c8ecdc134a16c7ca47ee9d013c8a7f98405197f3abf4550db592a600de9948"} Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.444715 4761 scope.go:117] "RemoveContainer" containerID="21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.444651 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.484541 4761 scope.go:117] "RemoveContainer" containerID="fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.484704 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.492082 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.511943 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:22 crc kubenswrapper[4761]: E1201 10:55:22.512364 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-log" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.512388 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-log" Dec 01 10:55:22 crc kubenswrapper[4761]: E1201 10:55:22.512416 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-httpd" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.512425 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-httpd" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.512625 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-httpd" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.512642 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" containerName="glance-log" Dec 01 10:55:22 crc kubenswrapper[4761]: 
I1201 10:55:22.513621 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.521167 4761 scope.go:117] "RemoveContainer" containerID="21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1" Dec 01 10:55:22 crc kubenswrapper[4761]: E1201 10:55:22.521765 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1\": container with ID starting with 21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1 not found: ID does not exist" containerID="21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.521838 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1"} err="failed to get container status \"21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1\": rpc error: code = NotFound desc = could not find container \"21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1\": container with ID starting with 21463e31d96a5d983fc45c0b6afa3d5352597826754f969a02dbd0c34f83cee1 not found: ID does not exist" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.521871 4761 scope.go:117] "RemoveContainer" containerID="fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d" Dec 01 10:55:22 crc kubenswrapper[4761]: E1201 10:55:22.522148 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d\": container with ID starting with fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d not found: ID does not exist" 
containerID="fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.522210 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d"} err="failed to get container status \"fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d\": rpc error: code = NotFound desc = could not find container \"fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d\": container with ID starting with fe33b2037ea296983459d697d9589511718799f4aa5bfa872d7579e501567d1d not found: ID does not exist" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.564175 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585427 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585477 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-sys\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585507 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-lib-modules\") pod \"glance-default-external-api-0\" (UID: 
\"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585524 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-dev\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585561 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6ltd\" (UniqueName: \"kubernetes.io/projected/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-kube-api-access-h6ltd\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585587 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-run\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585652 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585722 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585744 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585762 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585781 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585828 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585866 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.585915 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-logs\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688524 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-logs\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688626 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688652 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-sys\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688678 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688693 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-dev\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688715 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6ltd\" (UniqueName: \"kubernetes.io/projected/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-kube-api-access-h6ltd\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688732 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-run\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688747 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688746 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-sys\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688760 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688765 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688786 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688793 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688827 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688827 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688859 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688876 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-run\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688884 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688910 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-nvme\") pod 
\"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688917 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-dev\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688974 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.688931 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.689067 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.689304 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.689971 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-logs\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.692656 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.693255 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.703201 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6ltd\" (UniqueName: \"kubernetes.io/projected/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-kube-api-access-h6ltd\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.708077 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.710989 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:22 crc kubenswrapper[4761]: I1201 10:55:22.868743 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.438807 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57f30e9-7c56-4ac4-9124-089ea591304f" path="/var/lib/kubelet/pods/d57f30e9-7c56-4ac4-9124-089ea591304f/volumes" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.644349 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.753110 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833067 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msgp4\" (UniqueName: \"kubernetes.io/projected/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-kube-api-access-msgp4\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833154 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-config-data\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833178 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-sys\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833242 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-var-locks-brick\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833299 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-iscsi\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833329 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-httpd-run\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833358 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-scripts\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833379 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-lib-modules\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833400 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833430 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833465 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-dev\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833508 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-run\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833530 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-nvme\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833600 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-logs\") pod \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\" (UID: \"b16973a7-ae65-40d0-b3aa-1341cfb4df0e\") " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833763 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.833812 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-sys" (OuterVolumeSpecName: "sys") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834076 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-logs" (OuterVolumeSpecName: "logs") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834107 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-dev" (OuterVolumeSpecName: "dev") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834128 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-run" (OuterVolumeSpecName: "run") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834147 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834170 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834193 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834243 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834258 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834269 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834282 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834292 4761 
reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834302 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834313 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.834499 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.841089 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-scripts" (OuterVolumeSpecName: "scripts") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.841118 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-kube-api-access-msgp4" (OuterVolumeSpecName: "kube-api-access-msgp4") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "kube-api-access-msgp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.841199 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.841507 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.876244 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-config-data" (OuterVolumeSpecName: "config-data") pod "b16973a7-ae65-40d0-b3aa-1341cfb4df0e" (UID: "b16973a7-ae65-40d0-b3aa-1341cfb4df0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.936581 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msgp4\" (UniqueName: \"kubernetes.io/projected/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-kube-api-access-msgp4\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.937099 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.937159 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.937212 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.937296 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b16973a7-ae65-40d0-b3aa-1341cfb4df0e-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.937373 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.937442 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.955943 4761 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:55:23 crc kubenswrapper[4761]: I1201 10:55:23.957645 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.038723 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.038761 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.474433 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f","Type":"ContainerStarted","Data":"96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292"} Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.474885 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f","Type":"ContainerStarted","Data":"d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1"} Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.474907 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f","Type":"ContainerStarted","Data":"fd1278c5da1ddeaef6ba3b0a2bf952f474e3895b964d998a3fb1788241f3529a"} Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.478365 4761 generic.go:334] "Generic (PLEG): container finished" podID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" 
containerID="d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b" exitCode=0 Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.478414 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b16973a7-ae65-40d0-b3aa-1341cfb4df0e","Type":"ContainerDied","Data":"d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b"} Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.478444 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b16973a7-ae65-40d0-b3aa-1341cfb4df0e","Type":"ContainerDied","Data":"9b41ef7ea9a21fe797b7aed4bc2b5613f969ebfe943866c3c397d5c334d27be8"} Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.478470 4761 scope.go:117] "RemoveContainer" containerID="d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.478761 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.507982 4761 scope.go:117] "RemoveContainer" containerID="e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.515471 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.515408303 podStartE2EDuration="2.515408303s" podCreationTimestamp="2025-12-01 10:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:55:24.502220093 +0000 UTC m=+1463.805978727" watchObservedRunningTime="2025-12-01 10:55:24.515408303 +0000 UTC m=+1463.819166937" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.534290 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.534644 4761 scope.go:117] "RemoveContainer" containerID="d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b" Dec 01 10:55:24 crc kubenswrapper[4761]: E1201 10:55:24.535085 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b\": container with ID starting with d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b not found: ID does not exist" containerID="d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.535122 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b"} err="failed to get container status \"d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b\": rpc 
error: code = NotFound desc = could not find container \"d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b\": container with ID starting with d61b6b1041bc8de7874388ef587659164ede349b42463096cfbe367bd751120b not found: ID does not exist" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.535148 4761 scope.go:117] "RemoveContainer" containerID="e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf" Dec 01 10:55:24 crc kubenswrapper[4761]: E1201 10:55:24.535539 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf\": container with ID starting with e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf not found: ID does not exist" containerID="e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.535591 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf"} err="failed to get container status \"e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf\": rpc error: code = NotFound desc = could not find container \"e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf\": container with ID starting with e8ff370cb094d1734862d42d11801f504e08092f2d19cb9178e3ac106f551caf not found: ID does not exist" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.547849 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.554564 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:24 crc kubenswrapper[4761]: E1201 10:55:24.554835 4761 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerName="glance-httpd" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.554850 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerName="glance-httpd" Dec 01 10:55:24 crc kubenswrapper[4761]: E1201 10:55:24.554864 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerName="glance-log" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.555096 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerName="glance-log" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.555312 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerName="glance-log" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.555335 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" containerName="glance-httpd" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.556712 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.570264 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.650972 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651297 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651328 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651394 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651443 4761 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651505 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651523 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651691 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-dev\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651760 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-run\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651828 4761 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7k7x\" (UniqueName: \"kubernetes.io/projected/0d67dd70-d073-4363-a7bd-11aabcba83f4-kube-api-access-f7k7x\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651863 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-sys\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651937 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.651976 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.652008 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 
10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753307 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753622 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753676 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753722 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753741 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753760 4761 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753776 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753792 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753798 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753842 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753909 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753929 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.753980 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-dev\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.754007 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-run\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.754059 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7k7x\" (UniqueName: \"kubernetes.io/projected/0d67dd70-d073-4363-a7bd-11aabcba83f4-kube-api-access-f7k7x\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.754098 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-sys\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.754241 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-sys\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.754488 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.754568 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.754624 4761 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.758656 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-run\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.758764 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.758798 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.759412 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.760412 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.761653 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-dev\") pod 
\"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.766478 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.775177 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7k7x\" (UniqueName: \"kubernetes.io/projected/0d67dd70-d073-4363-a7bd-11aabcba83f4-kube-api-access-f7k7x\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.775440 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.777593 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:24 crc kubenswrapper[4761]: I1201 10:55:24.876578 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:25 crc kubenswrapper[4761]: I1201 10:55:25.143953 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b16973a7-ae65-40d0-b3aa-1341cfb4df0e" path="/var/lib/kubelet/pods/b16973a7-ae65-40d0-b3aa-1341cfb4df0e/volumes" Dec 01 10:55:25 crc kubenswrapper[4761]: I1201 10:55:25.306530 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:55:25 crc kubenswrapper[4761]: W1201 10:55:25.313872 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d67dd70_d073_4363_a7bd_11aabcba83f4.slice/crio-78fc5a649c927d0093c99a6bd83d58d8a2c79f364d5af5d33ffeca6f9bfc6517 WatchSource:0}: Error finding container 78fc5a649c927d0093c99a6bd83d58d8a2c79f364d5af5d33ffeca6f9bfc6517: Status 404 returned error can't find the container with id 78fc5a649c927d0093c99a6bd83d58d8a2c79f364d5af5d33ffeca6f9bfc6517 Dec 01 10:55:25 crc kubenswrapper[4761]: I1201 10:55:25.489931 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0d67dd70-d073-4363-a7bd-11aabcba83f4","Type":"ContainerStarted","Data":"a53241d0bbbbc566dde68761172a676884839cc95f119ccc7b8b3beacf6ee103"} Dec 01 10:55:25 crc kubenswrapper[4761]: I1201 10:55:25.490166 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0d67dd70-d073-4363-a7bd-11aabcba83f4","Type":"ContainerStarted","Data":"78fc5a649c927d0093c99a6bd83d58d8a2c79f364d5af5d33ffeca6f9bfc6517"} Dec 01 10:55:26 crc kubenswrapper[4761]: I1201 10:55:26.501770 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"0d67dd70-d073-4363-a7bd-11aabcba83f4","Type":"ContainerStarted","Data":"96901bc080f92593e11a163ba84271d12d9f508d6420445e8231eaab31906d02"} Dec 01 10:55:26 crc kubenswrapper[4761]: I1201 10:55:26.527267 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.5272465950000003 podStartE2EDuration="2.527246595s" podCreationTimestamp="2025-12-01 10:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:55:26.525748895 +0000 UTC m=+1465.829507509" watchObservedRunningTime="2025-12-01 10:55:26.527246595 +0000 UTC m=+1465.831005219" Dec 01 10:55:32 crc kubenswrapper[4761]: I1201 10:55:32.870751 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:32 crc kubenswrapper[4761]: I1201 10:55:32.871486 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:32 crc kubenswrapper[4761]: I1201 10:55:32.919983 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:32 crc kubenswrapper[4761]: I1201 10:55:32.946943 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:33 crc kubenswrapper[4761]: I1201 10:55:33.565154 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:33 crc kubenswrapper[4761]: I1201 10:55:33.565467 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:33 crc kubenswrapper[4761]: I1201 10:55:33.850482 4761 patch_prober.go:28] interesting 
pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:55:33 crc kubenswrapper[4761]: I1201 10:55:33.850590 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:55:33 crc kubenswrapper[4761]: I1201 10:55:33.850646 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:55:33 crc kubenswrapper[4761]: I1201 10:55:33.851531 4761 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d57787b78893daee12ca3c7dcee8cb3520b06bd08aeb0d4d1cb8f9e5545ff08"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:55:33 crc kubenswrapper[4761]: I1201 10:55:33.851689 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://7d57787b78893daee12ca3c7dcee8cb3520b06bd08aeb0d4d1cb8f9e5545ff08" gracePeriod=600 Dec 01 10:55:34 crc kubenswrapper[4761]: I1201 10:55:34.597529 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="7d57787b78893daee12ca3c7dcee8cb3520b06bd08aeb0d4d1cb8f9e5545ff08" exitCode=0 Dec 01 10:55:34 crc kubenswrapper[4761]: I1201 10:55:34.597573 
4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"7d57787b78893daee12ca3c7dcee8cb3520b06bd08aeb0d4d1cb8f9e5545ff08"} Dec 01 10:55:34 crc kubenswrapper[4761]: I1201 10:55:34.597924 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8"} Dec 01 10:55:34 crc kubenswrapper[4761]: I1201 10:55:34.597960 4761 scope.go:117] "RemoveContainer" containerID="ab11ccfedd2eeac2b7c4c9c4adbffd2e76c15b3f5230acf3c51b97fe7e1ab0cf" Dec 01 10:55:34 crc kubenswrapper[4761]: I1201 10:55:34.877250 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:34 crc kubenswrapper[4761]: I1201 10:55:34.879996 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:34 crc kubenswrapper[4761]: I1201 10:55:34.908441 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:34 crc kubenswrapper[4761]: I1201 10:55:34.953442 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:35 crc kubenswrapper[4761]: I1201 10:55:35.416850 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:35 crc kubenswrapper[4761]: I1201 10:55:35.418943 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:55:35 crc kubenswrapper[4761]: I1201 
10:55:35.609362 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:35 crc kubenswrapper[4761]: I1201 10:55:35.609396 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:37 crc kubenswrapper[4761]: I1201 10:55:37.570370 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:37 crc kubenswrapper[4761]: I1201 10:55:37.600193 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:55:52 crc kubenswrapper[4761]: I1201 10:55:52.616280 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.070103 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kbhd6"] Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.073003 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.079158 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbhd6"] Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.210804 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-catalog-content\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.210874 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5dkq\" (UniqueName: \"kubernetes.io/projected/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-kube-api-access-q5dkq\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.210920 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-utilities\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.312444 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-utilities\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.312626 4761 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-catalog-content\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.312788 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5dkq\" (UniqueName: \"kubernetes.io/projected/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-kube-api-access-q5dkq\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.313536 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-utilities\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.313699 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-catalog-content\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.335356 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5dkq\" (UniqueName: \"kubernetes.io/projected/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-kube-api-access-q5dkq\") pod \"community-operators-kbhd6\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.429053 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:00 crc kubenswrapper[4761]: I1201 10:56:00.913131 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbhd6"] Dec 01 10:56:00 crc kubenswrapper[4761]: W1201 10:56:00.915204 4761 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0cf877a_9bb1_4ef2_bd05_f2e0cfaaec65.slice/crio-a27caad859b42b9aaa647780f208efc8b57959cbf1aa18fa211655ceb936f16b WatchSource:0}: Error finding container a27caad859b42b9aaa647780f208efc8b57959cbf1aa18fa211655ceb936f16b: Status 404 returned error can't find the container with id a27caad859b42b9aaa647780f208efc8b57959cbf1aa18fa211655ceb936f16b Dec 01 10:56:01 crc kubenswrapper[4761]: I1201 10:56:01.870484 4761 generic.go:334] "Generic (PLEG): container finished" podID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerID="6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121" exitCode=0 Dec 01 10:56:01 crc kubenswrapper[4761]: I1201 10:56:01.870576 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbhd6" event={"ID":"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65","Type":"ContainerDied","Data":"6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121"} Dec 01 10:56:01 crc kubenswrapper[4761]: I1201 10:56:01.870895 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbhd6" event={"ID":"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65","Type":"ContainerStarted","Data":"a27caad859b42b9aaa647780f208efc8b57959cbf1aa18fa211655ceb936f16b"} Dec 01 10:56:03 crc kubenswrapper[4761]: I1201 10:56:03.893534 4761 generic.go:334] "Generic (PLEG): container finished" podID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerID="9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582" exitCode=0 Dec 01 10:56:03 crc kubenswrapper[4761]: I1201 
10:56:03.893637 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbhd6" event={"ID":"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65","Type":"ContainerDied","Data":"9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582"} Dec 01 10:56:04 crc kubenswrapper[4761]: I1201 10:56:04.905238 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbhd6" event={"ID":"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65","Type":"ContainerStarted","Data":"9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250"} Dec 01 10:56:04 crc kubenswrapper[4761]: I1201 10:56:04.920802 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kbhd6" podStartSLOduration=2.185174873 podStartE2EDuration="4.920775811s" podCreationTimestamp="2025-12-01 10:56:00 +0000 UTC" firstStartedPulling="2025-12-01 10:56:01.873578224 +0000 UTC m=+1501.177336848" lastFinishedPulling="2025-12-01 10:56:04.609179152 +0000 UTC m=+1503.912937786" observedRunningTime="2025-12-01 10:56:04.919494807 +0000 UTC m=+1504.223253431" watchObservedRunningTime="2025-12-01 10:56:04.920775811 +0000 UTC m=+1504.224534475" Dec 01 10:56:08 crc kubenswrapper[4761]: I1201 10:56:08.092244 4761 scope.go:117] "RemoveContainer" containerID="25f8f90637e9fc153c00c941ce83f59a23865f5a6e37ab2bd9a74ccac2671644" Dec 01 10:56:08 crc kubenswrapper[4761]: I1201 10:56:08.122211 4761 scope.go:117] "RemoveContainer" containerID="1015a574291915b39dda2e0a34a75e0604515f889835f11ad34dd0de24b79150" Dec 01 10:56:08 crc kubenswrapper[4761]: I1201 10:56:08.176462 4761 scope.go:117] "RemoveContainer" containerID="ce3a1f2e906938bf994dcac8838933c8d5e8160fd5006a25e6c8703edcd086c8" Dec 01 10:56:10 crc kubenswrapper[4761]: I1201 10:56:10.429925 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:10 crc kubenswrapper[4761]: 
I1201 10:56:10.430213 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:10 crc kubenswrapper[4761]: I1201 10:56:10.511314 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:11 crc kubenswrapper[4761]: I1201 10:56:11.033784 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:11 crc kubenswrapper[4761]: I1201 10:56:11.099541 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbhd6"] Dec 01 10:56:12 crc kubenswrapper[4761]: I1201 10:56:12.987148 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kbhd6" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerName="registry-server" containerID="cri-o://9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250" gracePeriod=2 Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.446983 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.460717 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-catalog-content\") pod \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.460782 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-utilities\") pod \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.460998 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5dkq\" (UniqueName: \"kubernetes.io/projected/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-kube-api-access-q5dkq\") pod \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\" (UID: \"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65\") " Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.462922 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-utilities" (OuterVolumeSpecName: "utilities") pod "d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" (UID: "d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.471871 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-kube-api-access-q5dkq" (OuterVolumeSpecName: "kube-api-access-q5dkq") pod "d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" (UID: "d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65"). InnerVolumeSpecName "kube-api-access-q5dkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.529102 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" (UID: "d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.562895 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.562943 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.562962 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5dkq\" (UniqueName: \"kubernetes.io/projected/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65-kube-api-access-q5dkq\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.996895 4761 generic.go:334] "Generic (PLEG): container finished" podID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerID="9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250" exitCode=0 Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.996968 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbhd6" Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.996956 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbhd6" event={"ID":"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65","Type":"ContainerDied","Data":"9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250"} Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.997096 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbhd6" event={"ID":"d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65","Type":"ContainerDied","Data":"a27caad859b42b9aaa647780f208efc8b57959cbf1aa18fa211655ceb936f16b"} Dec 01 10:56:13 crc kubenswrapper[4761]: I1201 10:56:13.997134 4761 scope.go:117] "RemoveContainer" containerID="9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250" Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.038120 4761 scope.go:117] "RemoveContainer" containerID="9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582" Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.040153 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbhd6"] Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.051947 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kbhd6"] Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.076240 4761 scope.go:117] "RemoveContainer" containerID="6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121" Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.109906 4761 scope.go:117] "RemoveContainer" containerID="9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250" Dec 01 10:56:14 crc kubenswrapper[4761]: E1201 10:56:14.110506 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250\": container with ID starting with 9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250 not found: ID does not exist" containerID="9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250" Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.110561 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250"} err="failed to get container status \"9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250\": rpc error: code = NotFound desc = could not find container \"9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250\": container with ID starting with 9fb28afae26ac78ceba8bbd4e0db204501986f2bb910b59a40ca42d450159250 not found: ID does not exist" Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.110589 4761 scope.go:117] "RemoveContainer" containerID="9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582" Dec 01 10:56:14 crc kubenswrapper[4761]: E1201 10:56:14.111113 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582\": container with ID starting with 9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582 not found: ID does not exist" containerID="9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582" Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.111147 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582"} err="failed to get container status \"9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582\": rpc error: code = NotFound desc = could not find container \"9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582\": container with ID 
starting with 9365c9ddcdbd22837386d4be3f45bdc4a45ba8ad17c950a8b8c2700411c28582 not found: ID does not exist" Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.111175 4761 scope.go:117] "RemoveContainer" containerID="6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121" Dec 01 10:56:14 crc kubenswrapper[4761]: E1201 10:56:14.111485 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121\": container with ID starting with 6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121 not found: ID does not exist" containerID="6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121" Dec 01 10:56:14 crc kubenswrapper[4761]: I1201 10:56:14.111510 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121"} err="failed to get container status \"6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121\": rpc error: code = NotFound desc = could not find container \"6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121\": container with ID starting with 6972b8cdfb5d58bc70c7caccd93ad7ccfc02af3b607b8e8c5651a2a998e42121 not found: ID does not exist" Dec 01 10:56:15 crc kubenswrapper[4761]: I1201 10:56:15.139134 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" path="/var/lib/kubelet/pods/d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65/volumes" Dec 01 10:56:17 crc kubenswrapper[4761]: I1201 10:56:17.803852 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:56:17 crc kubenswrapper[4761]: I1201 10:56:17.804788 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" 
podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerName="glance-httpd" containerID="cri-o://8f22fa8ce96c1829b6f6195e15938371b332506279c65c048124e901c586dccd" gracePeriod=30 Dec 01 10:56:17 crc kubenswrapper[4761]: I1201 10:56:17.804919 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerName="glance-log" containerID="cri-o://7799925df1befd027ab1bfc5bc44b250e99df2751f4cfb1440f3f8b226e5f020" gracePeriod=30 Dec 01 10:56:17 crc kubenswrapper[4761]: I1201 10:56:17.966533 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:56:17 crc kubenswrapper[4761]: I1201 10:56:17.966908 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerName="glance-log" containerID="cri-o://fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6" gracePeriod=30 Dec 01 10:56:17 crc kubenswrapper[4761]: I1201 10:56:17.967022 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerName="glance-httpd" containerID="cri-o://db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4" gracePeriod=30 Dec 01 10:56:18 crc kubenswrapper[4761]: I1201 10:56:18.038072 4761 generic.go:334] "Generic (PLEG): container finished" podID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerID="7799925df1befd027ab1bfc5bc44b250e99df2751f4cfb1440f3f8b226e5f020" exitCode=143 Dec 01 10:56:18 crc kubenswrapper[4761]: I1201 10:56:18.038124 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" 
event={"ID":"87789d6a-0bcb-4bac-86a4-97a57045c3bc","Type":"ContainerDied","Data":"7799925df1befd027ab1bfc5bc44b250e99df2751f4cfb1440f3f8b226e5f020"} Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.067720 4761 generic.go:334] "Generic (PLEG): container finished" podID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerID="fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6" exitCode=143 Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.067821 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"0eeeff96-0cd3-4ab7-bd66-6890fd79076d","Type":"ContainerDied","Data":"fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6"} Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.297066 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-cjld5"] Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.306260 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-cjld5"] Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.332569 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancea9d5-account-delete-6rtzq"] Dec 01 10:56:19 crc kubenswrapper[4761]: E1201 10:56:19.333002 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerName="extract-content" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.333035 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerName="extract-content" Dec 01 10:56:19 crc kubenswrapper[4761]: E1201 10:56:19.333057 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerName="registry-server" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.333070 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" 
containerName="registry-server" Dec 01 10:56:19 crc kubenswrapper[4761]: E1201 10:56:19.333101 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerName="extract-utilities" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.333114 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerName="extract-utilities" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.333360 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0cf877a-9bb1-4ef2-bd05-f2e0cfaaec65" containerName="registry-server" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.334209 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.354670 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancea9d5-account-delete-6rtzq"] Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.376437 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.378781 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerName="glance-log" containerID="cri-o://a53241d0bbbbc566dde68761172a676884839cc95f119ccc7b8b3beacf6ee103" gracePeriod=30 Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.378955 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerName="glance-httpd" containerID="cri-o://96901bc080f92593e11a163ba84271d12d9f508d6420445e8231eaab31906d02" gracePeriod=30 Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.437511 4761 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.438419 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-log" containerID="cri-o://d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1" gracePeriod=30 Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.438768 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-httpd" containerID="cri-o://96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292" gracePeriod=30 Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.510649 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mt4t\" (UniqueName: \"kubernetes.io/projected/6910d65d-0b49-4f06-ad73-164cb3dda0d4-kube-api-access-6mt4t\") pod \"glancea9d5-account-delete-6rtzq\" (UID: \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\") " pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.510730 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6910d65d-0b49-4f06-ad73-164cb3dda0d4-operator-scripts\") pod \"glancea9d5-account-delete-6rtzq\" (UID: \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\") " pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.613598 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6910d65d-0b49-4f06-ad73-164cb3dda0d4-operator-scripts\") pod 
\"glancea9d5-account-delete-6rtzq\" (UID: \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\") " pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.613936 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mt4t\" (UniqueName: \"kubernetes.io/projected/6910d65d-0b49-4f06-ad73-164cb3dda0d4-kube-api-access-6mt4t\") pod \"glancea9d5-account-delete-6rtzq\" (UID: \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\") " pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.614826 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6910d65d-0b49-4f06-ad73-164cb3dda0d4-operator-scripts\") pod \"glancea9d5-account-delete-6rtzq\" (UID: \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\") " pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.634644 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mt4t\" (UniqueName: \"kubernetes.io/projected/6910d65d-0b49-4f06-ad73-164cb3dda0d4-kube-api-access-6mt4t\") pod \"glancea9d5-account-delete-6rtzq\" (UID: \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\") " pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:19 crc kubenswrapper[4761]: I1201 10:56:19.658929 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:20 crc kubenswrapper[4761]: I1201 10:56:20.092879 4761 generic.go:334] "Generic (PLEG): container finished" podID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerID="a53241d0bbbbc566dde68761172a676884839cc95f119ccc7b8b3beacf6ee103" exitCode=143 Dec 01 10:56:20 crc kubenswrapper[4761]: I1201 10:56:20.092989 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0d67dd70-d073-4363-a7bd-11aabcba83f4","Type":"ContainerDied","Data":"a53241d0bbbbc566dde68761172a676884839cc95f119ccc7b8b3beacf6ee103"} Dec 01 10:56:20 crc kubenswrapper[4761]: I1201 10:56:20.095034 4761 generic.go:334] "Generic (PLEG): container finished" podID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerID="d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1" exitCode=143 Dec 01 10:56:20 crc kubenswrapper[4761]: I1201 10:56:20.095068 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f","Type":"ContainerDied","Data":"d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1"} Dec 01 10:56:20 crc kubenswrapper[4761]: I1201 10:56:20.122168 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancea9d5-account-delete-6rtzq"] Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.104994 4761 generic.go:334] "Generic (PLEG): container finished" podID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerID="8f22fa8ce96c1829b6f6195e15938371b332506279c65c048124e901c586dccd" exitCode=0 Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.105347 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"87789d6a-0bcb-4bac-86a4-97a57045c3bc","Type":"ContainerDied","Data":"8f22fa8ce96c1829b6f6195e15938371b332506279c65c048124e901c586dccd"} Dec 
01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.107395 4761 generic.go:334] "Generic (PLEG): container finished" podID="6910d65d-0b49-4f06-ad73-164cb3dda0d4" containerID="8b7e9db98de6dcba21deb66f0a0dd48732e37f6cf224f0560f1a3c7f1a40c302" exitCode=0 Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.107427 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" event={"ID":"6910d65d-0b49-4f06-ad73-164cb3dda0d4","Type":"ContainerDied","Data":"8b7e9db98de6dcba21deb66f0a0dd48732e37f6cf224f0560f1a3c7f1a40c302"} Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.107457 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" event={"ID":"6910d65d-0b49-4f06-ad73-164cb3dda0d4","Type":"ContainerStarted","Data":"9e38a15c30bb5c86808b5e30cc676d0b8048b53ae00ed6c60e3759e5627b1f4a"} Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.152729 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f7736e-ab16-4070-a95c-e13f2681b346" path="/var/lib/kubelet/pods/43f7736e-ab16-4070-a95c-e13f2681b346/volumes" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.507887 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.629864 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.645633 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-sys\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.645911 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-iscsi\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646014 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-logs\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646131 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646257 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xcj9\" (UniqueName: \"kubernetes.io/projected/87789d6a-0bcb-4bac-86a4-97a57045c3bc-kube-api-access-5xcj9\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646383 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-var-locks-brick\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646513 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646664 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-nvme\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646777 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-run\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.645729 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-sys" (OuterVolumeSpecName: "sys") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646911 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-dev\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646017 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646586 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-logs" (OuterVolumeSpecName: "logs") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646630 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646760 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.646808 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-run" (OuterVolumeSpecName: "run") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.647020 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-httpd-run\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.647073 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-scripts\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.647123 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-config-data\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.647169 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-lib-modules\") pod \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\" (UID: \"87789d6a-0bcb-4bac-86a4-97a57045c3bc\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.647411 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-dev" (OuterVolumeSpecName: "dev") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.648328 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.648357 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.648369 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.648379 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.648390 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 
10:56:21.648400 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.648413 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.648640 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.648991 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.654771 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-scripts" (OuterVolumeSpecName: "scripts") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.659037 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.660976 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.661874 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87789d6a-0bcb-4bac-86a4-97a57045c3bc-kube-api-access-5xcj9" (OuterVolumeSpecName: "kube-api-access-5xcj9") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "kube-api-access-5xcj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.704346 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-config-data" (OuterVolumeSpecName: "config-data") pod "87789d6a-0bcb-4bac-86a4-97a57045c3bc" (UID: "87789d6a-0bcb-4bac-86a4-97a57045c3bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749360 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-scripts\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749396 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-run\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749426 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-var-locks-brick\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749444 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-sys\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749491 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749521 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-lib-modules\") pod 
\"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749573 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-logs\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749602 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-nvme\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749670 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749702 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-iscsi\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749722 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-config-data\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749740 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-dev\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749769 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cp8t\" (UniqueName: \"kubernetes.io/projected/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-kube-api-access-4cp8t\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.749791 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-httpd-run\") pod \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\" (UID: \"0eeeff96-0cd3-4ab7-bd66-6890fd79076d\") " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750145 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750138 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-sys" (OuterVolumeSpecName: "sys") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750162 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87789d6a-0bcb-4bac-86a4-97a57045c3bc-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750175 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750190 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87789d6a-0bcb-4bac-86a4-97a57045c3bc-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750202 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87789d6a-0bcb-4bac-86a4-97a57045c3bc-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750219 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750231 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xcj9\" (UniqueName: \"kubernetes.io/projected/87789d6a-0bcb-4bac-86a4-97a57045c3bc-kube-api-access-5xcj9\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750168 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-run" (OuterVolumeSpecName: "run") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750208 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750283 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-dev" (OuterVolumeSpecName: "dev") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750442 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-logs" (OuterVolumeSpecName: "logs") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750469 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750858 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.750907 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.751166 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.752944 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-scripts" (OuterVolumeSpecName: "scripts") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.753007 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.754263 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.755257 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-kube-api-access-4cp8t" (OuterVolumeSpecName: "kube-api-access-4cp8t") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "kube-api-access-4cp8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.767795 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.769315 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.800208 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-config-data" (OuterVolumeSpecName: "config-data") pod "0eeeff96-0cd3-4ab7-bd66-6890fd79076d" (UID: "0eeeff96-0cd3-4ab7-bd66-6890fd79076d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851489 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851527 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851614 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851627 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc 
kubenswrapper[4761]: I1201 10:56:21.851658 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851672 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851696 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851707 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851718 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cp8t\" (UniqueName: \"kubernetes.io/projected/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-kube-api-access-4cp8t\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851730 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.851740 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.852250 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node 
\"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.852305 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.852326 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.852346 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.852364 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0eeeff96-0cd3-4ab7-bd66-6890fd79076d-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.879013 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.892335 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.953161 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:21 crc kubenswrapper[4761]: I1201 10:56:21.953378 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Dec 01 
10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.119081 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"87789d6a-0bcb-4bac-86a4-97a57045c3bc","Type":"ContainerDied","Data":"d10fba358e6249575b7a178f1abd2fb06036fcc934627b602f368dd0d6c36380"} Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.119136 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.119180 4761 scope.go:117] "RemoveContainer" containerID="8f22fa8ce96c1829b6f6195e15938371b332506279c65c048124e901c586dccd" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.123141 4761 generic.go:334] "Generic (PLEG): container finished" podID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerID="db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4" exitCode=0 Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.123246 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.123310 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"0eeeff96-0cd3-4ab7-bd66-6890fd79076d","Type":"ContainerDied","Data":"db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4"} Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.123345 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"0eeeff96-0cd3-4ab7-bd66-6890fd79076d","Type":"ContainerDied","Data":"157d3bd1f089d1636ccbee34f7f1f420bcee364d62f92d5bd23f0a94d142862a"} Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.154236 4761 scope.go:117] "RemoveContainer" containerID="7799925df1befd027ab1bfc5bc44b250e99df2751f4cfb1440f3f8b226e5f020" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.167188 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.178585 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.183437 4761 scope.go:117] "RemoveContainer" containerID="db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.185430 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.193997 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.203985 4761 scope.go:117] "RemoveContainer" containerID="fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6" Dec 01 10:56:22 crc kubenswrapper[4761]: 
I1201 10:56:22.221077 4761 scope.go:117] "RemoveContainer" containerID="db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4" Dec 01 10:56:22 crc kubenswrapper[4761]: E1201 10:56:22.221567 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4\": container with ID starting with db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4 not found: ID does not exist" containerID="db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.221610 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4"} err="failed to get container status \"db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4\": rpc error: code = NotFound desc = could not find container \"db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4\": container with ID starting with db421a6503d3cdea518fac65c9e2c8d88d2b70cf72396cf09603fba84e5ac8e4 not found: ID does not exist" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.221636 4761 scope.go:117] "RemoveContainer" containerID="fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6" Dec 01 10:56:22 crc kubenswrapper[4761]: E1201 10:56:22.221976 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6\": container with ID starting with fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6 not found: ID does not exist" containerID="fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.221999 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6"} err="failed to get container status \"fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6\": rpc error: code = NotFound desc = could not find container \"fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6\": container with ID starting with fde5fbe8717abf2978a216707c2192bfe66dcf689de8c8c509b315cb1d3d14d6 not found: ID does not exist" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.369258 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.562429 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mt4t\" (UniqueName: \"kubernetes.io/projected/6910d65d-0b49-4f06-ad73-164cb3dda0d4-kube-api-access-6mt4t\") pod \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\" (UID: \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\") " Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.562679 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6910d65d-0b49-4f06-ad73-164cb3dda0d4-operator-scripts\") pod \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\" (UID: \"6910d65d-0b49-4f06-ad73-164cb3dda0d4\") " Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.563656 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6910d65d-0b49-4f06-ad73-164cb3dda0d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6910d65d-0b49-4f06-ad73-164cb3dda0d4" (UID: "6910d65d-0b49-4f06-ad73-164cb3dda0d4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.572455 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6910d65d-0b49-4f06-ad73-164cb3dda0d4-kube-api-access-6mt4t" (OuterVolumeSpecName: "kube-api-access-6mt4t") pod "6910d65d-0b49-4f06-ad73-164cb3dda0d4" (UID: "6910d65d-0b49-4f06-ad73-164cb3dda0d4"). InnerVolumeSpecName "kube-api-access-6mt4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.664984 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6910d65d-0b49-4f06-ad73-164cb3dda0d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:22 crc kubenswrapper[4761]: I1201 10:56:22.665047 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mt4t\" (UniqueName: \"kubernetes.io/projected/6910d65d-0b49-4f06-ad73-164cb3dda0d4-kube-api-access-6mt4t\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.014806 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.072840 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-scripts\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.078250 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-scripts" (OuterVolumeSpecName: "scripts") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.132061 4761 generic.go:334] "Generic (PLEG): container finished" podID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerID="96901bc080f92593e11a163ba84271d12d9f508d6420445e8231eaab31906d02" exitCode=0 Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.134019 4761 generic.go:334] "Generic (PLEG): container finished" podID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerID="96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292" exitCode=0 Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.134125 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.135619 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.139344 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" path="/var/lib/kubelet/pods/0eeeff96-0cd3-4ab7-bd66-6890fd79076d/volumes" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.140432 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" path="/var/lib/kubelet/pods/87789d6a-0bcb-4bac-86a4-97a57045c3bc/volumes" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.144675 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0d67dd70-d073-4363-a7bd-11aabcba83f4","Type":"ContainerDied","Data":"96901bc080f92593e11a163ba84271d12d9f508d6420445e8231eaab31906d02"} Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.144733 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f","Type":"ContainerDied","Data":"96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292"} Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.144753 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f","Type":"ContainerDied","Data":"fd1278c5da1ddeaef6ba3b0a2bf952f474e3895b964d998a3fb1788241f3529a"} Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.144768 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea9d5-account-delete-6rtzq" event={"ID":"6910d65d-0b49-4f06-ad73-164cb3dda0d4","Type":"ContainerDied","Data":"9e38a15c30bb5c86808b5e30cc676d0b8048b53ae00ed6c60e3759e5627b1f4a"} Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.144782 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e38a15c30bb5c86808b5e30cc676d0b8048b53ae00ed6c60e3759e5627b1f4a" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.144821 4761 scope.go:117] "RemoveContainer" containerID="96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174467 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6ltd\" (UniqueName: \"kubernetes.io/projected/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-kube-api-access-h6ltd\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174521 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-lib-modules\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174581 4761 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-sys\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174614 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-config-data\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174631 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-var-locks-brick\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174668 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-logs\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174694 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174737 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 
10:56:23.174759 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-httpd-run\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174786 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-dev\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174808 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-run\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174832 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-nvme\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.174857 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-iscsi\") pod \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\" (UID: \"c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.175066 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.175123 4761 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.176391 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-dev" (OuterVolumeSpecName: "dev") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.178687 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-run" (OuterVolumeSpecName: "run") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.178795 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.178838 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.179082 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-logs" (OuterVolumeSpecName: "logs") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.179115 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.179204 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.179218 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.179225 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.179248 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-sys" (OuterVolumeSpecName: "sys") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.179932 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-kube-api-access-h6ltd" (OuterVolumeSpecName: "kube-api-access-h6ltd") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "kube-api-access-h6ltd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.196631 4761 scope.go:117] "RemoveContainer" containerID="d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.208994 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.214916 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-config-data" (OuterVolumeSpecName: "config-data") pod "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" (UID: "c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.216281 4761 scope.go:117] "RemoveContainer" containerID="96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292" Dec 01 10:56:23 crc kubenswrapper[4761]: E1201 10:56:23.216657 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292\": container with ID starting with 96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292 not found: ID does not exist" containerID="96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.216692 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292"} err="failed to get container status \"96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292\": rpc error: code = NotFound desc = could not find container \"96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292\": container with ID starting with 96df09e4b2b68fa325b4e00c029ebba0e17dc092de9cef04c871459797fcf292 not found: ID does not exist" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.216714 4761 scope.go:117] "RemoveContainer" containerID="d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1" Dec 01 10:56:23 crc kubenswrapper[4761]: E1201 10:56:23.216978 4761 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1\": container with ID starting with d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1 not found: ID does not exist" containerID="d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.217045 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1"} err="failed to get container status \"d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1\": rpc error: code = NotFound desc = could not find container \"d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1\": container with ID starting with d2ec5d73ed38f15362eaf9f8038cfd604601bab16b071b045fe21fc39306dac1 not found: ID does not exist" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.275951 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-scripts\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.275998 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-logs\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276016 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-nvme\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: 
\"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276045 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-run\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276061 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-dev\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276090 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276109 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276130 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-lib-modules\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276148 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7k7x\" (UniqueName: \"kubernetes.io/projected/0d67dd70-d073-4363-a7bd-11aabcba83f4-kube-api-access-f7k7x\") pod 
\"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276165 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-sys\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276184 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-var-locks-brick\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276205 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-iscsi\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276229 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-httpd-run\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276243 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-config-data\") pod \"0d67dd70-d073-4363-a7bd-11aabcba83f4\" (UID: \"0d67dd70-d073-4363-a7bd-11aabcba83f4\") " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276415 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276425 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276433 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276443 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6ltd\" (UniqueName: \"kubernetes.io/projected/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-kube-api-access-h6ltd\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276451 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276458 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276466 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276474 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276481 
4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276657 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276677 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276687 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.276695 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.277670 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.277673 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-run" (OuterVolumeSpecName: "run") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.277797 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.277844 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.277980 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-sys" (OuterVolumeSpecName: "sys") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.278044 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.278106 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-dev" (OuterVolumeSpecName: "dev") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.278208 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.278287 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-logs" (OuterVolumeSpecName: "logs") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.279945 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-scripts" (OuterVolumeSpecName: "scripts") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.280909 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.281104 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d67dd70-d073-4363-a7bd-11aabcba83f4-kube-api-access-f7k7x" (OuterVolumeSpecName: "kube-api-access-f7k7x") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "kube-api-access-f7k7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.281388 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.297403 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.298286 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.310967 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-config-data" (OuterVolumeSpecName: "config-data") pod "0d67dd70-d073-4363-a7bd-11aabcba83f4" (UID: "0d67dd70-d073-4363-a7bd-11aabcba83f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377493 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377532 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377566 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377581 4761 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: 
I1201 10:56:23.377595 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7k7x\" (UniqueName: \"kubernetes.io/projected/0d67dd70-d073-4363-a7bd-11aabcba83f4-kube-api-access-f7k7x\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377609 4761 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-sys\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377620 4761 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377632 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377644 4761 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377655 4761 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377666 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377677 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d67dd70-d073-4363-a7bd-11aabcba83f4-scripts\") on 
node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377689 4761 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d67dd70-d073-4363-a7bd-11aabcba83f4-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377699 4761 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377710 4761 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.377721 4761 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d67dd70-d073-4363-a7bd-11aabcba83f4-dev\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.396701 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.402055 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.465642 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.470377 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.478344 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: I1201 10:56:23.478407 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:23 crc kubenswrapper[4761]: E1201 10:56:23.606739 4761 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7d2c9c2_90a7_477b_80ea_fcdb1e8e649f.slice/crio-fd1278c5da1ddeaef6ba3b0a2bf952f474e3895b964d998a3fb1788241f3529a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7d2c9c2_90a7_477b_80ea_fcdb1e8e649f.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.151728 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.151761 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"0d67dd70-d073-4363-a7bd-11aabcba83f4","Type":"ContainerDied","Data":"78fc5a649c927d0093c99a6bd83d58d8a2c79f364d5af5d33ffeca6f9bfc6517"} Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.152262 4761 scope.go:117] "RemoveContainer" containerID="96901bc080f92593e11a163ba84271d12d9f508d6420445e8231eaab31906d02" Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.188638 4761 scope.go:117] "RemoveContainer" containerID="a53241d0bbbbc566dde68761172a676884839cc95f119ccc7b8b3beacf6ee103" Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.209299 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.222943 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.353023 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-jhwd8"] Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.358468 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-jhwd8"] Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.377890 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g"] Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.392444 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancea9d5-account-delete-6rtzq"] Dec 01 10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.399463 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-a9d5-account-create-update-8hw8g"] Dec 01 
10:56:24 crc kubenswrapper[4761]: I1201 10:56:24.405230 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancea9d5-account-delete-6rtzq"] Dec 01 10:56:25 crc kubenswrapper[4761]: I1201 10:56:25.142952 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" path="/var/lib/kubelet/pods/0d67dd70-d073-4363-a7bd-11aabcba83f4/volumes" Dec 01 10:56:25 crc kubenswrapper[4761]: I1201 10:56:25.144159 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6910d65d-0b49-4f06-ad73-164cb3dda0d4" path="/var/lib/kubelet/pods/6910d65d-0b49-4f06-ad73-164cb3dda0d4/volumes" Dec 01 10:56:25 crc kubenswrapper[4761]: I1201 10:56:25.145027 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72e3e267-9018-433d-95ec-0456c9fef8da" path="/var/lib/kubelet/pods/72e3e267-9018-433d-95ec-0456c9fef8da/volumes" Dec 01 10:56:25 crc kubenswrapper[4761]: I1201 10:56:25.146358 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" path="/var/lib/kubelet/pods/c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f/volumes" Dec 01 10:56:25 crc kubenswrapper[4761]: I1201 10:56:25.147114 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0" path="/var/lib/kubelet/pods/d76b57f2-c1b1-43ab-bbb5-0c8411d0f8d0/volumes" Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.690884 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.691872 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-server" containerID="cri-o://8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692226 
4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="swift-recon-cron" containerID="cri-o://3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692273 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="rsync" containerID="cri-o://163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692327 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-expirer" containerID="cri-o://2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692363 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-updater" containerID="cri-o://c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692396 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-auditor" containerID="cri-o://b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692428 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-replicator" 
containerID="cri-o://cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692460 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-server" containerID="cri-o://3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692498 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-updater" containerID="cri-o://0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692527 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-auditor" containerID="cri-o://b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692573 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-replicator" containerID="cri-o://cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692605 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-server" containerID="cri-o://33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692633 4761 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-reaper" containerID="cri-o://9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692662 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-auditor" containerID="cri-o://d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.692691 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-replicator" containerID="cri-o://b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.722143 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-rgk2z"] Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.732924 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-rgk2z"] Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.737493 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb"] Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.737737 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-httpd" containerID="cri-o://18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: I1201 10:56:33.738121 4761 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-server" containerID="cri-o://a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf" gracePeriod=30 Dec 01 10:56:33 crc kubenswrapper[4761]: E1201 10:56:33.806632 4761 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-conmon-9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-conmon-b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-conmon-2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-conmon-0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-conmon-b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f34da4_e281_4e68_9a1f_02c97211a365.slice/crio-conmon-c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.274687 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.274999 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275011 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275025 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275034 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" 
containerID="cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275042 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275050 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275060 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275067 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275075 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275083 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275093 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275101 4761 generic.go:334] "Generic (PLEG): container 
finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275109 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.274792 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275186 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275204 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275216 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275227 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 
10:56:34.275239 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275249 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275260 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275274 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275286 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275297 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275308 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275320 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.275331 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.277089 4761 generic.go:334] "Generic (PLEG): container finished" podID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerID="18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1" exitCode=0 Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.277139 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" event={"ID":"e546fe9d-d4e0-475b-a1c5-034b718ea4de","Type":"ContainerDied","Data":"18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1"} Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.703117 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.846622 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-log-httpd\") pod \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.846779 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e546fe9d-d4e0-475b-a1c5-034b718ea4de-config-data\") pod \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.846882 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjnjt\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-kube-api-access-xjnjt\") pod \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.847202 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e546fe9d-d4e0-475b-a1c5-034b718ea4de" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.847765 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") pod \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.847857 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-run-httpd\") pod \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\" (UID: \"e546fe9d-d4e0-475b-a1c5-034b718ea4de\") " Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.848302 4761 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.848671 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e546fe9d-d4e0-475b-a1c5-034b718ea4de" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.852452 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e546fe9d-d4e0-475b-a1c5-034b718ea4de" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.853135 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-kube-api-access-xjnjt" (OuterVolumeSpecName: "kube-api-access-xjnjt") pod "e546fe9d-d4e0-475b-a1c5-034b718ea4de" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de"). InnerVolumeSpecName "kube-api-access-xjnjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.896419 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e546fe9d-d4e0-475b-a1c5-034b718ea4de-config-data" (OuterVolumeSpecName: "config-data") pod "e546fe9d-d4e0-475b-a1c5-034b718ea4de" (UID: "e546fe9d-d4e0-475b-a1c5-034b718ea4de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.949772 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e546fe9d-d4e0-475b-a1c5-034b718ea4de-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.949814 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjnjt\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-kube-api-access-xjnjt\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.949827 4761 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e546fe9d-d4e0-475b-a1c5-034b718ea4de-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:34 crc kubenswrapper[4761]: I1201 10:56:34.949839 4761 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e546fe9d-d4e0-475b-a1c5-034b718ea4de-run-httpd\") on node \"crc\" DevicePath 
\"\"" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.039840 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-j9f8v"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.049777 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-j9f8v"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.072623 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-69d7456d48-pnj5v"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.072964 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" podUID="ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" containerName="keystone-api" containerID="cri-o://7b8b65954d8565b8b5a54a846d325c91d0722d4f767aceb447fad866dfdefbc3" gracePeriod=30 Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.101027 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-75g7j"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.123398 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-75g7j"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.196266 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb8eee3-094e-4a75-b41e-5183c5f09278" path="/var/lib/kubelet/pods/1fb8eee3-094e-4a75-b41e-5183c5f09278/volumes" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.197063 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a08ed0-59f3-4e0a-84ba-02a02a886e68" path="/var/lib/kubelet/pods/37a08ed0-59f3-4e0a-84ba-02a02a886e68/volumes" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.197604 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92aa0c50-5322-495e-b8cf-6b6fc22813f8" path="/var/lib/kubelet/pods/92aa0c50-5322-495e-b8cf-6b6fc22813f8/volumes" Dec 01 10:56:35 crc 
kubenswrapper[4761]: I1201 10:56:35.198166 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone529d-account-delete-pt86z"] Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198371 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198386 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198405 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198411 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198418 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198423 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198431 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198437 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198447 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198452 4761 
state_mem.go:107] "Deleted CPUSet assignment" podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198460 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6910d65d-0b49-4f06-ad73-164cb3dda0d4" containerName="mariadb-account-delete" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198465 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6910d65d-0b49-4f06-ad73-164cb3dda0d4" containerName="mariadb-account-delete" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198477 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-server" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198483 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-server" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198492 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198498 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198512 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198518 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198524 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198531 4761 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.198540 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198560 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198682 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198694 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="6910d65d-0b49-4f06-ad73-164cb3dda0d4" containerName="mariadb-account-delete" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198709 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-server" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198716 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198724 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198733 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerName="proxy-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198739 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198747 4761 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0eeeff96-0cd3-4ab7-bd66-6890fd79076d" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198755 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="87789d6a-0bcb-4bac-86a4-97a57045c3bc" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198763 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-log" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.198770 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d67dd70-d073-4363-a7bd-11aabcba83f4" containerName="glance-httpd" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.199134 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone529d-account-delete-pt86z"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.199219 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.285619 4761 generic.go:334] "Generic (PLEG): container finished" podID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" containerID="a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf" exitCode=0 Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.285657 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" event={"ID":"e546fe9d-d4e0-475b-a1c5-034b718ea4de","Type":"ContainerDied","Data":"a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf"} Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.285710 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" event={"ID":"e546fe9d-d4e0-475b-a1c5-034b718ea4de","Type":"ContainerDied","Data":"3d056e2cb6003065446142a870491fe25a3a47d8a64b54643ad71b2b740ae7eb"} Dec 01 10:56:35 crc 
kubenswrapper[4761]: I1201 10:56:35.285730 4761 scope.go:117] "RemoveContainer" containerID="a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.285725 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.305322 4761 scope.go:117] "RemoveContainer" containerID="18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.310473 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.316584 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-cq9vb"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.324168 4761 scope.go:117] "RemoveContainer" containerID="a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.324530 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf\": container with ID starting with a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf not found: ID does not exist" containerID="a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.324574 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf"} err="failed to get container status \"a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf\": rpc error: code = NotFound desc = could not find container \"a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf\": 
container with ID starting with a0825055bc3fc6bda660b720c47bff6f869095323979ce1baa2fd522fc7f74bf not found: ID does not exist" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.324600 4761 scope.go:117] "RemoveContainer" containerID="18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1" Dec 01 10:56:35 crc kubenswrapper[4761]: E1201 10:56:35.324973 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1\": container with ID starting with 18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1 not found: ID does not exist" containerID="18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.325022 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1"} err="failed to get container status \"18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1\": rpc error: code = NotFound desc = could not find container \"18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1\": container with ID starting with 18ed731027dc1e22a3546422464626d1a16f3afd02139a09142db57bb94ef0b1 not found: ID does not exist" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.366392 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45slb\" (UniqueName: \"kubernetes.io/projected/5226eb1e-f30c-4ef9-a218-d9234255a6ca-kube-api-access-45slb\") pod \"keystone529d-account-delete-pt86z\" (UID: \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\") " pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.366945 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts\") pod \"keystone529d-account-delete-pt86z\" (UID: \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\") " pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.468571 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts\") pod \"keystone529d-account-delete-pt86z\" (UID: \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\") " pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.468629 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45slb\" (UniqueName: \"kubernetes.io/projected/5226eb1e-f30c-4ef9-a218-d9234255a6ca-kube-api-access-45slb\") pod \"keystone529d-account-delete-pt86z\" (UID: \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\") " pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.469665 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts\") pod \"keystone529d-account-delete-pt86z\" (UID: \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\") " pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.488108 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45slb\" (UniqueName: \"kubernetes.io/projected/5226eb1e-f30c-4ef9-a218-d9234255a6ca-kube-api-access-45slb\") pod \"keystone529d-account-delete-pt86z\" (UID: \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\") " pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.515351 4761 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.727242 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.734830 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.739394 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.858922 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-2" podUID="f6d62685-3430-4fba-b0ca-34ae3169f562" containerName="galera" containerID="cri-o://104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95" gracePeriod=30 Dec 01 10:56:35 crc kubenswrapper[4761]: I1201 10:56:35.982269 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone529d-account-delete-pt86z"] Dec 01 10:56:36 crc kubenswrapper[4761]: I1201 10:56:36.303456 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" event={"ID":"5226eb1e-f30c-4ef9-a218-d9234255a6ca","Type":"ContainerStarted","Data":"b1aca609619c9535aecf13c9e17df4540c913aacfb5ee1e0f89906d9408bd414"} Dec 01 10:56:36 crc kubenswrapper[4761]: I1201 10:56:36.303862 4761 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" secret="" err="secret \"galera-openstack-dockercfg-4crrg\" not found" Dec 01 10:56:36 crc kubenswrapper[4761]: I1201 10:56:36.303920 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" event={"ID":"5226eb1e-f30c-4ef9-a218-d9234255a6ca","Type":"ContainerStarted","Data":"cc699446e8ecb409543f5d5ad108b19d2d3c3d395940afa2fc2a198cfa9b5fbd"} Dec 01 10:56:36 crc kubenswrapper[4761]: I1201 10:56:36.334097 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" podStartSLOduration=1.334065002 podStartE2EDuration="1.334065002s" podCreationTimestamp="2025-12-01 10:56:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:56:36.327390454 +0000 UTC m=+1535.631149088" watchObservedRunningTime="2025-12-01 10:56:36.334065002 +0000 UTC m=+1535.637823646" Dec 01 10:56:36 crc kubenswrapper[4761]: I1201 10:56:36.372116 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 01 10:56:36 crc kubenswrapper[4761]: I1201 10:56:36.372463 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/memcached-0" podUID="12b286f6-2061-4845-a2ea-68fb621ff4d0" containerName="memcached" containerID="cri-o://1e547856630cfabe6fb63d68f08c7e3b67211111602a7caa2d762aa223206c45" gracePeriod=30 Dec 01 10:56:36 crc kubenswrapper[4761]: E1201 10:56:36.483906 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Dec 01 10:56:36 crc kubenswrapper[4761]: E1201 10:56:36.484024 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts podName:5226eb1e-f30c-4ef9-a218-d9234255a6ca nodeName:}" 
failed. No retries permitted until 2025-12-01 10:56:36.983996001 +0000 UTC m=+1536.287754645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts") pod "keystone529d-account-delete-pt86z" (UID: "5226eb1e-f30c-4ef9-a218-d9234255a6ca") : configmap "openstack-scripts" not found Dec 01 10:56:36 crc kubenswrapper[4761]: I1201 10:56:36.924883 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:56:36 crc kubenswrapper[4761]: I1201 10:56:36.946586 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 01 10:56:36 crc kubenswrapper[4761]: E1201 10:56:36.991300 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Dec 01 10:56:36 crc kubenswrapper[4761]: E1201 10:56:36.991358 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts podName:5226eb1e-f30c-4ef9-a218-d9234255a6ca nodeName:}" failed. No retries permitted until 2025-12-01 10:56:37.991344745 +0000 UTC m=+1537.295103369 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts") pod "keystone529d-account-delete-pt86z" (UID: "5226eb1e-f30c-4ef9-a218-d9234255a6ca") : configmap "openstack-scripts" not found Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.092088 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-generated\") pod \"f6d62685-3430-4fba-b0ca-34ae3169f562\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.092166 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdj8\" (UniqueName: \"kubernetes.io/projected/f6d62685-3430-4fba-b0ca-34ae3169f562-kube-api-access-zgdj8\") pod \"f6d62685-3430-4fba-b0ca-34ae3169f562\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.092479 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f6d62685-3430-4fba-b0ca-34ae3169f562" (UID: "f6d62685-3430-4fba-b0ca-34ae3169f562"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.093063 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-default\") pod \"f6d62685-3430-4fba-b0ca-34ae3169f562\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.093088 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f6d62685-3430-4fba-b0ca-34ae3169f562\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.093116 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-operator-scripts\") pod \"f6d62685-3430-4fba-b0ca-34ae3169f562\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.093140 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-kolla-config\") pod \"f6d62685-3430-4fba-b0ca-34ae3169f562\" (UID: \"f6d62685-3430-4fba-b0ca-34ae3169f562\") " Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.093407 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f6d62685-3430-4fba-b0ca-34ae3169f562" (UID: "f6d62685-3430-4fba-b0ca-34ae3169f562"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.093606 4761 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.093912 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f6d62685-3430-4fba-b0ca-34ae3169f562" (UID: "f6d62685-3430-4fba-b0ca-34ae3169f562"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.094053 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6d62685-3430-4fba-b0ca-34ae3169f562" (UID: "f6d62685-3430-4fba-b0ca-34ae3169f562"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.096919 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d62685-3430-4fba-b0ca-34ae3169f562-kube-api-access-zgdj8" (OuterVolumeSpecName: "kube-api-access-zgdj8") pod "f6d62685-3430-4fba-b0ca-34ae3169f562" (UID: "f6d62685-3430-4fba-b0ca-34ae3169f562"). InnerVolumeSpecName "kube-api-access-zgdj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.101638 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "f6d62685-3430-4fba-b0ca-34ae3169f562" (UID: "f6d62685-3430-4fba-b0ca-34ae3169f562"). 
InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.135865 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e546fe9d-d4e0-475b-a1c5-034b718ea4de" path="/var/lib/kubelet/pods/e546fe9d-d4e0-475b-a1c5-034b718ea4de/volumes" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.195533 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdj8\" (UniqueName: \"kubernetes.io/projected/f6d62685-3430-4fba-b0ca-34ae3169f562-kube-api-access-zgdj8\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.195601 4761 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.195651 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.195671 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.195688 4761 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d62685-3430-4fba-b0ca-34ae3169f562-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.208592 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.297422 4761 
reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.317033 4761 generic.go:334] "Generic (PLEG): container finished" podID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" containerID="b1aca609619c9535aecf13c9e17df4540c913aacfb5ee1e0f89906d9408bd414" exitCode=1 Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.317092 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" event={"ID":"5226eb1e-f30c-4ef9-a218-d9234255a6ca","Type":"ContainerDied","Data":"b1aca609619c9535aecf13c9e17df4540c913aacfb5ee1e0f89906d9408bd414"} Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.317511 4761 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" secret="" err="secret \"galera-openstack-dockercfg-4crrg\" not found" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.317570 4761 scope.go:117] "RemoveContainer" containerID="b1aca609619c9535aecf13c9e17df4540c913aacfb5ee1e0f89906d9408bd414" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.322135 4761 generic.go:334] "Generic (PLEG): container finished" podID="f6d62685-3430-4fba-b0ca-34ae3169f562" containerID="104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95" exitCode=0 Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.322166 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f6d62685-3430-4fba-b0ca-34ae3169f562","Type":"ContainerDied","Data":"104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95"} Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.322186 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" 
event={"ID":"f6d62685-3430-4fba-b0ca-34ae3169f562","Type":"ContainerDied","Data":"bfa39049da02d22abd2bc2bf5bddcfdee0a4e64f0955edd0fe2df67029cef680"} Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.322204 4761 scope.go:117] "RemoveContainer" containerID="104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.322273 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.390191 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.401725 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.405835 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.409199 4761 scope.go:117] "RemoveContainer" containerID="a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.446019 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/rabbitmq-server-0" podUID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" containerName="rabbitmq" containerID="cri-o://2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8" gracePeriod=604800 Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.462108 4761 scope.go:117] "RemoveContainer" containerID="104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95" Dec 01 10:56:37 crc kubenswrapper[4761]: E1201 10:56:37.465268 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95\": container with ID 
starting with 104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95 not found: ID does not exist" containerID="104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.465526 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95"} err="failed to get container status \"104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95\": rpc error: code = NotFound desc = could not find container \"104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95\": container with ID starting with 104b495ff408edd08bf5886963313d99525faa5d9c5cba44c9e080245ebfac95 not found: ID does not exist" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.465574 4761 scope.go:117] "RemoveContainer" containerID="a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0" Dec 01 10:56:37 crc kubenswrapper[4761]: E1201 10:56:37.468316 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0\": container with ID starting with a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0 not found: ID does not exist" containerID="a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.468368 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0"} err="failed to get container status \"a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0\": rpc error: code = NotFound desc = could not find container \"a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0\": container with ID starting with a231484af2c660bc6fce1d4bfed0cfe34a3682f1d8f79443039199067fb24ad0 not found: 
ID does not exist" Dec 01 10:56:37 crc kubenswrapper[4761]: I1201 10:56:37.900443 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-1" podUID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" containerName="galera" containerID="cri-o://a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb" gracePeriod=28 Dec 01 10:56:38 crc kubenswrapper[4761]: E1201 10:56:38.008919 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Dec 01 10:56:38 crc kubenswrapper[4761]: E1201 10:56:38.009395 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts podName:5226eb1e-f30c-4ef9-a218-d9234255a6ca nodeName:}" failed. No retries permitted until 2025-12-01 10:56:40.009369962 +0000 UTC m=+1539.313128586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts") pod "keystone529d-account-delete-pt86z" (UID: "5226eb1e-f30c-4ef9-a218-d9234255a6ca") : configmap "openstack-scripts" not found Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.112821 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/memcached-0" podUID="12b286f6-2061-4845-a2ea-68fb621ff4d0" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.70:11211: connect: connection refused" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.341846 4761 generic.go:334] "Generic (PLEG): container finished" podID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" containerID="a36659245ca617d899ecdb1a5452b409200c0ce60d79ca787b1695dc6a3381b5" exitCode=1 Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.341903 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" 
event={"ID":"5226eb1e-f30c-4ef9-a218-d9234255a6ca","Type":"ContainerDied","Data":"a36659245ca617d899ecdb1a5452b409200c0ce60d79ca787b1695dc6a3381b5"} Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.341934 4761 scope.go:117] "RemoveContainer" containerID="b1aca609619c9535aecf13c9e17df4540c913aacfb5ee1e0f89906d9408bd414" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.342311 4761 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" secret="" err="secret \"galera-openstack-dockercfg-4crrg\" not found" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.342341 4761 scope.go:117] "RemoveContainer" containerID="a36659245ca617d899ecdb1a5452b409200c0ce60d79ca787b1695dc6a3381b5" Dec 01 10:56:38 crc kubenswrapper[4761]: E1201 10:56:38.342610 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystone529d-account-delete-pt86z_glance-kuttl-tests(5226eb1e-f30c-4ef9-a218-d9234255a6ca)\"" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.344818 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md"] Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.357661 4761 generic.go:334] "Generic (PLEG): container finished" podID="ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" containerID="7b8b65954d8565b8b5a54a846d325c91d0722d4f767aceb447fad866dfdefbc3" exitCode=0 Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.357721 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" 
event={"ID":"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c","Type":"ContainerDied","Data":"7b8b65954d8565b8b5a54a846d325c91d0722d4f767aceb447fad866dfdefbc3"} Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.365835 4761 generic.go:334] "Generic (PLEG): container finished" podID="12b286f6-2061-4845-a2ea-68fb621ff4d0" containerID="1e547856630cfabe6fb63d68f08c7e3b67211111602a7caa2d762aa223206c45" exitCode=0 Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.365997 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" podUID="1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" containerName="manager" containerID="cri-o://9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8" gracePeriod=10 Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.366093 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"12b286f6-2061-4845-a2ea-68fb621ff4d0","Type":"ContainerDied","Data":"1e547856630cfabe6fb63d68f08c7e3b67211111602a7caa2d762aa223206c45"} Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.523724 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.591204 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-2d6pk"] Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.591397 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-2d6pk" podUID="9e884079-1d5d-40f2-a169-f2f0781bad65" containerName="registry-server" containerID="cri-o://00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8" gracePeriod=30 Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.615600 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb"] Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.624227 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-config-data\") pod \"12b286f6-2061-4845-a2ea-68fb621ff4d0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.624256 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-kolla-config\") pod \"12b286f6-2061-4845-a2ea-68fb621ff4d0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.624358 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28rfk\" (UniqueName: \"kubernetes.io/projected/12b286f6-2061-4845-a2ea-68fb621ff4d0-kube-api-access-28rfk\") pod \"12b286f6-2061-4845-a2ea-68fb621ff4d0\" (UID: \"12b286f6-2061-4845-a2ea-68fb621ff4d0\") " Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.624376 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/252e96c6965e0ae772bc85e87d7a6852f6ea164c363f40995f31a0f9e3k29mb"] Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.625715 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-config-data" (OuterVolumeSpecName: "config-data") pod "12b286f6-2061-4845-a2ea-68fb621ff4d0" (UID: "12b286f6-2061-4845-a2ea-68fb621ff4d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.626087 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "12b286f6-2061-4845-a2ea-68fb621ff4d0" (UID: "12b286f6-2061-4845-a2ea-68fb621ff4d0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.632881 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b286f6-2061-4845-a2ea-68fb621ff4d0-kube-api-access-28rfk" (OuterVolumeSpecName: "kube-api-access-28rfk") pod "12b286f6-2061-4845-a2ea-68fb621ff4d0" (UID: "12b286f6-2061-4845-a2ea-68fb621ff4d0"). InnerVolumeSpecName "kube-api-access-28rfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.743317 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28rfk\" (UniqueName: \"kubernetes.io/projected/12b286f6-2061-4845-a2ea-68fb621ff4d0-kube-api-access-28rfk\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.743359 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.743369 4761 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12b286f6-2061-4845-a2ea-68fb621ff4d0-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.854008 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.945101 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-scripts\") pod \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.945168 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-credential-keys\") pod \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.945197 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv52t\" (UniqueName: 
\"kubernetes.io/projected/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-kube-api-access-bv52t\") pod \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.945254 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-fernet-keys\") pod \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.945289 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-config-data\") pod \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\" (UID: \"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c\") " Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.953663 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-scripts" (OuterVolumeSpecName: "scripts") pod "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" (UID: "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.953694 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" (UID: "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.953717 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" (UID: "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.953725 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-kube-api-access-bv52t" (OuterVolumeSpecName: "kube-api-access-bv52t") pod "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" (UID: "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c"). InnerVolumeSpecName "kube-api-access-bv52t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:38 crc kubenswrapper[4761]: I1201 10:56:38.973886 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-config-data" (OuterVolumeSpecName: "config-data") pod "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" (UID: "ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.009335 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.047196 4761 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.047226 4761 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.047254 4761 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.047262 4761 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.047449 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv52t\" (UniqueName: \"kubernetes.io/projected/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c-kube-api-access-bv52t\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.103678 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.113861 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.135433 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728cc888-0261-42fc-93da-a9f5ddd03382" path="/var/lib/kubelet/pods/728cc888-0261-42fc-93da-a9f5ddd03382/volumes" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.136146 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d62685-3430-4fba-b0ca-34ae3169f562" path="/var/lib/kubelet/pods/f6d62685-3430-4fba-b0ca-34ae3169f562/volumes" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.148034 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzst\" (UniqueName: \"kubernetes.io/projected/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-kube-api-access-qkzst\") pod \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.148145 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-apiservice-cert\") pod \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.148226 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-webhook-cert\") pod \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\" (UID: \"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.154689 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" (UID: "1f389388-aa4f-4fe2-a6a5-b55a9ab9f014"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.154709 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" (UID: "1f389388-aa4f-4fe2-a6a5-b55a9ab9f014"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.154727 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-kube-api-access-qkzst" (OuterVolumeSpecName: "kube-api-access-qkzst") pod "1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" (UID: "1f389388-aa4f-4fe2-a6a5-b55a9ab9f014"). InnerVolumeSpecName "kube-api-access-qkzst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249313 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-erlang-cookie\") pod \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249394 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret\") pod \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249572 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjtsx\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-kube-api-access-vjtsx\") pod 
\"e07e5919-c158-40b5-a20d-6c07c7f98ecd\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249639 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e07e5919-c158-40b5-a20d-6c07c7f98ecd-pod-info\") pod \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249685 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-confd\") pod \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249724 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv25n\" (UniqueName: \"kubernetes.io/projected/9e884079-1d5d-40f2-a169-f2f0781bad65-kube-api-access-zv25n\") pod \"9e884079-1d5d-40f2-a169-f2f0781bad65\" (UID: \"9e884079-1d5d-40f2-a169-f2f0781bad65\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249772 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-plugins\") pod \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249806 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf\") pod \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249824 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e07e5919-c158-40b5-a20d-6c07c7f98ecd" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.249993 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438\") pod \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\" (UID: \"e07e5919-c158-40b5-a20d-6c07c7f98ecd\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.250393 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzst\" (UniqueName: \"kubernetes.io/projected/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-kube-api-access-qkzst\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.250427 4761 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.250446 4761 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.250465 4761 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.250738 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e07e5919-c158-40b5-a20d-6c07c7f98ecd" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.251435 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e07e5919-c158-40b5-a20d-6c07c7f98ecd" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.252373 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-kube-api-access-vjtsx" (OuterVolumeSpecName: "kube-api-access-vjtsx") pod "e07e5919-c158-40b5-a20d-6c07c7f98ecd" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd"). InnerVolumeSpecName "kube-api-access-vjtsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.252488 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e07e5919-c158-40b5-a20d-6c07c7f98ecd-pod-info" (OuterVolumeSpecName: "pod-info") pod "e07e5919-c158-40b5-a20d-6c07c7f98ecd" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.252747 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e07e5919-c158-40b5-a20d-6c07c7f98ecd" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd"). 
InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.253717 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e884079-1d5d-40f2-a169-f2f0781bad65-kube-api-access-zv25n" (OuterVolumeSpecName: "kube-api-access-zv25n") pod "9e884079-1d5d-40f2-a169-f2f0781bad65" (UID: "9e884079-1d5d-40f2-a169-f2f0781bad65"). InnerVolumeSpecName "kube-api-access-zv25n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.264740 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438" (OuterVolumeSpecName: "persistence") pod "e07e5919-c158-40b5-a20d-6c07c7f98ecd" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd"). InnerVolumeSpecName "pvc-0ea9de0e-511f-47f6-92f5-30756585a438". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.316646 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e07e5919-c158-40b5-a20d-6c07c7f98ecd" (UID: "e07e5919-c158-40b5-a20d-6c07c7f98ecd"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.351540 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjtsx\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-kube-api-access-vjtsx\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.351599 4761 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e07e5919-c158-40b5-a20d-6c07c7f98ecd-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.351613 4761 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.351626 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv25n\" (UniqueName: \"kubernetes.io/projected/9e884079-1d5d-40f2-a169-f2f0781bad65-kube-api-access-zv25n\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.351637 4761 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e07e5919-c158-40b5-a20d-6c07c7f98ecd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.351651 4761 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e07e5919-c158-40b5-a20d-6c07c7f98ecd-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.351688 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0ea9de0e-511f-47f6-92f5-30756585a438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438\") on node \"crc\" 
" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.351704 4761 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e07e5919-c158-40b5-a20d-6c07c7f98ecd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.372028 4761 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.372304 4761 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0ea9de0e-511f-47f6-92f5-30756585a438" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438") on node "crc" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.376242 4761 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" secret="" err="secret \"galera-openstack-dockercfg-4crrg\" not found" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.376314 4761 scope.go:117] "RemoveContainer" containerID="a36659245ca617d899ecdb1a5452b409200c0ce60d79ca787b1695dc6a3381b5" Dec 01 10:56:39 crc kubenswrapper[4761]: E1201 10:56:39.376674 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystone529d-account-delete-pt86z_glance-kuttl-tests(5226eb1e-f30c-4ef9-a218-d9234255a6ca)\"" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.377465 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.377471 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-69d7456d48-pnj5v" event={"ID":"ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c","Type":"ContainerDied","Data":"430152392fa5cf915d1b9d6992b2f45c3790d7905fde375154879f3c0a12d059"} Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.378145 4761 scope.go:117] "RemoveContainer" containerID="7b8b65954d8565b8b5a54a846d325c91d0722d4f767aceb447fad866dfdefbc3" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.379926 4761 generic.go:334] "Generic (PLEG): container finished" podID="1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" containerID="9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8" exitCode=0 Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.379976 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" event={"ID":"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014","Type":"ContainerDied","Data":"9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8"} Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.380017 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" event={"ID":"1f389388-aa4f-4fe2-a6a5-b55a9ab9f014","Type":"ContainerDied","Data":"53d1917a6265c38a69b8117b30fca0de48e6f6a8c862d239632e55ad16e14af0"} Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.380027 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.383954 4761 generic.go:334] "Generic (PLEG): container finished" podID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" containerID="2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8" exitCode=0 Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.384104 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e07e5919-c158-40b5-a20d-6c07c7f98ecd","Type":"ContainerDied","Data":"2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8"} Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.384205 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e07e5919-c158-40b5-a20d-6c07c7f98ecd","Type":"ContainerDied","Data":"26b4b258e289dbd521be1c5c5dcfb22a781a54bfed246cf937793889e9b91dcb"} Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.384345 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.386596 4761 generic.go:334] "Generic (PLEG): container finished" podID="9e884079-1d5d-40f2-a169-f2f0781bad65" containerID="00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8" exitCode=0 Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.386684 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-2d6pk" event={"ID":"9e884079-1d5d-40f2-a169-f2f0781bad65","Type":"ContainerDied","Data":"00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8"} Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.386717 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-2d6pk" event={"ID":"9e884079-1d5d-40f2-a169-f2f0781bad65","Type":"ContainerDied","Data":"a8e520d3d970c6cb838982a4bba44afd7a948e5c3a681be5e97842a2fab2ef18"} Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.386795 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-2d6pk" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.392733 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"12b286f6-2061-4845-a2ea-68fb621ff4d0","Type":"ContainerDied","Data":"477c97998d9c32ce2a583e151a7412814bbb8ce7a821eb9de4f1e9a64710c52f"} Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.392824 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.404812 4761 scope.go:117] "RemoveContainer" containerID="9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.424380 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-69d7456d48-pnj5v"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.432308 4761 scope.go:117] "RemoveContainer" containerID="9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8" Dec 01 10:56:39 crc kubenswrapper[4761]: E1201 10:56:39.433419 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8\": container with ID starting with 9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8 not found: ID does not exist" containerID="9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.433516 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8"} err="failed to get container status \"9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8\": rpc error: code = NotFound desc = could not find container \"9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8\": container with ID starting with 9107da90e23d53316609952016fef45f554a8aa09f8b6dd8698c08e08dc707c8 not found: ID does not exist" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.433614 4761 scope.go:117] "RemoveContainer" containerID="2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.434463 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-69d7456d48-pnj5v"] 
Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.452864 4761 reconciler_common.go:293] "Volume detached for volume \"pvc-0ea9de0e-511f-47f6-92f5-30756585a438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ea9de0e-511f-47f6-92f5-30756585a438\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.471029 4761 scope.go:117] "RemoveContainer" containerID="922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.472633 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.485660 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/memcached-0"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.493173 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-2d6pk"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.501691 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-2d6pk"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.505967 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.511189 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7958ffffd8-wm6md"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.516316 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.520654 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.533522 4761 scope.go:117] "RemoveContainer" 
containerID="2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8" Dec 01 10:56:39 crc kubenswrapper[4761]: E1201 10:56:39.533961 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8\": container with ID starting with 2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8 not found: ID does not exist" containerID="2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.533994 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8"} err="failed to get container status \"2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8\": rpc error: code = NotFound desc = could not find container \"2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8\": container with ID starting with 2c8912f897fd58be99e7b710037fe8114c96fea9ef71d517a09439e064daf6d8 not found: ID does not exist" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.534015 4761 scope.go:117] "RemoveContainer" containerID="922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6" Dec 01 10:56:39 crc kubenswrapper[4761]: E1201 10:56:39.534336 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6\": container with ID starting with 922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6 not found: ID does not exist" containerID="922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.534370 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6"} err="failed to get container status \"922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6\": rpc error: code = NotFound desc = could not find container \"922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6\": container with ID starting with 922695dffabfcbfcdf689465c7871fd3a8186bc1b5a5ebd9a2a2e8221e58c7c6 not found: ID does not exist" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.534390 4761 scope.go:117] "RemoveContainer" containerID="00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.553578 4761 scope.go:117] "RemoveContainer" containerID="00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8" Dec 01 10:56:39 crc kubenswrapper[4761]: E1201 10:56:39.553942 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8\": container with ID starting with 00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8 not found: ID does not exist" containerID="00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.553972 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8"} err="failed to get container status \"00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8\": rpc error: code = NotFound desc = could not find container \"00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8\": container with ID starting with 00771849207cf9b510479c922d4c264121f6425c1f7ce6fe237c3204b9131eb8 not found: ID does not exist" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.554007 4761 scope.go:117] "RemoveContainer" 
containerID="1e547856630cfabe6fb63d68f08c7e3b67211111602a7caa2d762aa223206c45" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.755190 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.859161 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-generated\") pod \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.859514 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-default\") pod \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.859683 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.859801 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kolla-config\") pod \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.859942 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25q2t\" (UniqueName: \"kubernetes.io/projected/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kube-api-access-25q2t\") pod \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\" (UID: 
\"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.860082 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-operator-scripts\") pod \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\" (UID: \"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e\") " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.859839 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" (UID: "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.860289 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" (UID: "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.860717 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" (UID: "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.862924 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" (UID: "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.864280 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kube-api-access-25q2t" (OuterVolumeSpecName: "kube-api-access-25q2t") pod "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" (UID: "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e"). InnerVolumeSpecName "kube-api-access-25q2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.873653 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" (UID: "7040d73f-f2e1-4a80-a719-8a2f8ff10f7e"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.948167 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-0" podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" containerName="galera" containerID="cri-o://3fe83aff52a11f78ad3da24bbf5719465d1ea5a39022fe7e63bd7d8a8ea94546" gracePeriod=26 Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.964730 4761 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.964798 4761 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.964835 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.964877 4761 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.964891 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25q2t\" (UniqueName: \"kubernetes.io/projected/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-kube-api-access-25q2t\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.964904 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:39 crc kubenswrapper[4761]: I1201 10:56:39.988636 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.065846 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:40 crc kubenswrapper[4761]: E1201 10:56:40.065932 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Dec 01 10:56:40 crc kubenswrapper[4761]: E1201 10:56:40.065979 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts podName:5226eb1e-f30c-4ef9-a218-d9234255a6ca nodeName:}" failed. No retries permitted until 2025-12-01 10:56:44.065963701 +0000 UTC m=+1543.369722315 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts") pod "keystone529d-account-delete-pt86z" (UID: "5226eb1e-f30c-4ef9-a218-d9234255a6ca") : configmap "openstack-scripts" not found Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.150283 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-rstz4"] Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.158622 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-rstz4"] Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.167480 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone529d-account-delete-pt86z"] Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.176093 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-529d-account-create-update-ng6qb"] Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.181673 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-529d-account-create-update-ng6qb"] Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.403056 4761 generic.go:334] "Generic (PLEG): container finished" podID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" containerID="3fe83aff52a11f78ad3da24bbf5719465d1ea5a39022fe7e63bd7d8a8ea94546" exitCode=0 Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.403118 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec","Type":"ContainerDied","Data":"3fe83aff52a11f78ad3da24bbf5719465d1ea5a39022fe7e63bd7d8a8ea94546"} Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.406805 4761 generic.go:334] "Generic (PLEG): container finished" podID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" containerID="a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb" exitCode=0 Dec 
01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.406889 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.406897 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e","Type":"ContainerDied","Data":"a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb"} Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.407029 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"7040d73f-f2e1-4a80-a719-8a2f8ff10f7e","Type":"ContainerDied","Data":"28bc08c5d8d9ebf863a0d40bf98b87e48aeb4862f6850ab63037bf059185dca9"} Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.407070 4761 scope.go:117] "RemoveContainer" containerID="a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.481023 4761 scope.go:117] "RemoveContainer" containerID="51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.509596 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.514603 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.523821 4761 scope.go:117] "RemoveContainer" containerID="a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb" Dec 01 10:56:40 crc kubenswrapper[4761]: E1201 10:56:40.524815 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb\": container with ID starting with 
a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb not found: ID does not exist" containerID="a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.524841 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb"} err="failed to get container status \"a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb\": rpc error: code = NotFound desc = could not find container \"a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb\": container with ID starting with a99343dd6ed5815d99789a11762b848cbefb51fbbb42cfb73b8e69a9ae974aeb not found: ID does not exist" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.524865 4761 scope.go:117] "RemoveContainer" containerID="51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9" Dec 01 10:56:40 crc kubenswrapper[4761]: E1201 10:56:40.525792 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9\": container with ID starting with 51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9 not found: ID does not exist" containerID="51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.525834 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9"} err="failed to get container status \"51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9\": rpc error: code = NotFound desc = could not find container \"51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9\": container with ID starting with 51089eeb90ed9c9033232e2de79c3c1d09a4c12e3dd3cce65c67fc82e23c9ac9 not found: ID does not 
exist" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.710187 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.722767 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.879704 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kolla-config\") pod \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.879752 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-generated\") pod \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.879823 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45slb\" (UniqueName: \"kubernetes.io/projected/5226eb1e-f30c-4ef9-a218-d9234255a6ca-kube-api-access-45slb\") pod \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\" (UID: \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\") " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.879858 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.879878 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rwvm\" (UniqueName: 
\"kubernetes.io/projected/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kube-api-access-6rwvm\") pod \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.879903 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts\") pod \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\" (UID: \"5226eb1e-f30c-4ef9-a218-d9234255a6ca\") " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.879923 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-default\") pod \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.879951 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-operator-scripts\") pod \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\" (UID: \"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec\") " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.880463 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" (UID: "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.880721 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5226eb1e-f30c-4ef9-a218-d9234255a6ca" (UID: "5226eb1e-f30c-4ef9-a218-d9234255a6ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.880853 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" (UID: "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.881126 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" (UID: "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.880854 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" (UID: "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.884255 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5226eb1e-f30c-4ef9-a218-d9234255a6ca-kube-api-access-45slb" (OuterVolumeSpecName: "kube-api-access-45slb") pod "5226eb1e-f30c-4ef9-a218-d9234255a6ca" (UID: "5226eb1e-f30c-4ef9-a218-d9234255a6ca"). InnerVolumeSpecName "kube-api-access-45slb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.884422 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kube-api-access-6rwvm" (OuterVolumeSpecName: "kube-api-access-6rwvm") pod "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" (UID: "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec"). InnerVolumeSpecName "kube-api-access-6rwvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.889491 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" (UID: "7ebdf60a-b95f-4443-9bcc-452c3d2da2ec"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.981584 4761 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.981898 4761 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.981921 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45slb\" (UniqueName: \"kubernetes.io/projected/5226eb1e-f30c-4ef9-a218-d9234255a6ca-kube-api-access-45slb\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.981971 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.981991 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rwvm\" (UniqueName: \"kubernetes.io/projected/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-kube-api-access-6rwvm\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.982011 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5226eb1e-f30c-4ef9-a218-d9234255a6ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:40 crc kubenswrapper[4761]: I1201 10:56:40.982031 4761 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:40 crc 
kubenswrapper[4761]: I1201 10:56:40.982051 4761 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.004186 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.084155 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.144052 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b286f6-2061-4845-a2ea-68fb621ff4d0" path="/var/lib/kubelet/pods/12b286f6-2061-4845-a2ea-68fb621ff4d0/volumes" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.145320 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" path="/var/lib/kubelet/pods/1f389388-aa4f-4fe2-a6a5-b55a9ab9f014/volumes" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.146828 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" path="/var/lib/kubelet/pods/7040d73f-f2e1-4a80-a719-8a2f8ff10f7e/volumes" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.149066 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e884079-1d5d-40f2-a169-f2f0781bad65" path="/var/lib/kubelet/pods/9e884079-1d5d-40f2-a169-f2f0781bad65/volumes" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.150383 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" path="/var/lib/kubelet/pods/e07e5919-c158-40b5-a20d-6c07c7f98ecd/volumes" Dec 01 10:56:41 crc 
kubenswrapper[4761]: I1201 10:56:41.151641 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7cb9538-4868-41e0-b3b8-67c31f777482" path="/var/lib/kubelet/pods/e7cb9538-4868-41e0-b3b8-67c31f777482/volumes" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.153730 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ed070d-7777-4cf1-b0c7-007059a78cc3" path="/var/lib/kubelet/pods/e8ed070d-7777-4cf1-b0c7-007059a78cc3/volumes" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.154776 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" path="/var/lib/kubelet/pods/ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c/volumes" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.437430 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" event={"ID":"5226eb1e-f30c-4ef9-a218-d9234255a6ca","Type":"ContainerDied","Data":"cc699446e8ecb409543f5d5ad108b19d2d3c3d395940afa2fc2a198cfa9b5fbd"} Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.437635 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone529d-account-delete-pt86z" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.437834 4761 scope.go:117] "RemoveContainer" containerID="a36659245ca617d899ecdb1a5452b409200c0ce60d79ca787b1695dc6a3381b5" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.495174 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone529d-account-delete-pt86z"] Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.498947 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7ebdf60a-b95f-4443-9bcc-452c3d2da2ec","Type":"ContainerDied","Data":"ebd2c2235ca97bdba671327e9120d44524e621192083fda8f3f10608bc95eb6a"} Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.498999 4761 scope.go:117] "RemoveContainer" containerID="3fe83aff52a11f78ad3da24bbf5719465d1ea5a39022fe7e63bd7d8a8ea94546" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.499148 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.503667 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone529d-account-delete-pt86z"] Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.538301 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.544096 4761 scope.go:117] "RemoveContainer" containerID="4f1441564040cbecd73e6e6f1d1b54ca71dcaedb0bd78ccd80643395dc5d1b70" Dec 01 10:56:41 crc kubenswrapper[4761]: I1201 10:56:41.559998 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.136488 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" path="/var/lib/kubelet/pods/5226eb1e-f30c-4ef9-a218-d9234255a6ca/volumes" Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.137343 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" path="/var/lib/kubelet/pods/7ebdf60a-b95f-4443-9bcc-452c3d2da2ec/volumes" Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.425598 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg"] Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.425850 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" podUID="1476658c-4234-4688-9c90-25ec6ba4a55d" containerName="manager" containerID="cri-o://757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a" gracePeriod=10 Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.639596 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/swift-operator-index-stmvr"] Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.639821 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-stmvr" podUID="c3153419-6c29-4301-a072-acfcee97b630" containerName="registry-server" containerID="cri-o://d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673" gracePeriod=30 Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.664777 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb"] Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.675961 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bzl8pb"] Dec 01 10:56:43 crc kubenswrapper[4761]: I1201 10:56:43.885353 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.027424 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.035703 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-webhook-cert\") pod \"1476658c-4234-4688-9c90-25ec6ba4a55d\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.035776 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfv5z\" (UniqueName: \"kubernetes.io/projected/1476658c-4234-4688-9c90-25ec6ba4a55d-kube-api-access-lfv5z\") pod \"1476658c-4234-4688-9c90-25ec6ba4a55d\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.035870 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-apiservice-cert\") pod \"1476658c-4234-4688-9c90-25ec6ba4a55d\" (UID: \"1476658c-4234-4688-9c90-25ec6ba4a55d\") " Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.053758 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "1476658c-4234-4688-9c90-25ec6ba4a55d" (UID: "1476658c-4234-4688-9c90-25ec6ba4a55d"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.053899 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1476658c-4234-4688-9c90-25ec6ba4a55d-kube-api-access-lfv5z" (OuterVolumeSpecName: "kube-api-access-lfv5z") pod "1476658c-4234-4688-9c90-25ec6ba4a55d" (UID: "1476658c-4234-4688-9c90-25ec6ba4a55d"). InnerVolumeSpecName "kube-api-access-lfv5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.053979 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "1476658c-4234-4688-9c90-25ec6ba4a55d" (UID: "1476658c-4234-4688-9c90-25ec6ba4a55d"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.137183 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvm78\" (UniqueName: \"kubernetes.io/projected/c3153419-6c29-4301-a072-acfcee97b630-kube-api-access-lvm78\") pod \"c3153419-6c29-4301-a072-acfcee97b630\" (UID: \"c3153419-6c29-4301-a072-acfcee97b630\") " Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.137574 4761 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.137591 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfv5z\" (UniqueName: \"kubernetes.io/projected/1476658c-4234-4688-9c90-25ec6ba4a55d-kube-api-access-lfv5z\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.137605 4761 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1476658c-4234-4688-9c90-25ec6ba4a55d-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.139696 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3153419-6c29-4301-a072-acfcee97b630-kube-api-access-lvm78" (OuterVolumeSpecName: "kube-api-access-lvm78") pod "c3153419-6c29-4301-a072-acfcee97b630" (UID: 
"c3153419-6c29-4301-a072-acfcee97b630"). InnerVolumeSpecName "kube-api-access-lvm78". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.238879 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvm78\" (UniqueName: \"kubernetes.io/projected/c3153419-6c29-4301-a072-acfcee97b630-kube-api-access-lvm78\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.527306 4761 generic.go:334] "Generic (PLEG): container finished" podID="1476658c-4234-4688-9c90-25ec6ba4a55d" containerID="757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a" exitCode=0 Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.527369 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.527429 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" event={"ID":"1476658c-4234-4688-9c90-25ec6ba4a55d","Type":"ContainerDied","Data":"757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a"} Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.527477 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" event={"ID":"1476658c-4234-4688-9c90-25ec6ba4a55d","Type":"ContainerDied","Data":"0e1574aa22d11c87d6beb4e3941f239d5ebdee810a8649ac46a431924d2bafaa"} Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.527507 4761 scope.go:117] "RemoveContainer" containerID="757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.529568 4761 generic.go:334] "Generic (PLEG): container finished" podID="c3153419-6c29-4301-a072-acfcee97b630" 
containerID="d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673" exitCode=0 Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.529606 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-stmvr" event={"ID":"c3153419-6c29-4301-a072-acfcee97b630","Type":"ContainerDied","Data":"d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673"} Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.529617 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-stmvr" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.529630 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-stmvr" event={"ID":"c3153419-6c29-4301-a072-acfcee97b630","Type":"ContainerDied","Data":"e8c00ab72024931b0fc319fe2e709529335cd613fece2c96474e37b850d1a02e"} Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.557174 4761 scope.go:117] "RemoveContainer" containerID="757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a" Dec 01 10:56:44 crc kubenswrapper[4761]: E1201 10:56:44.558462 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a\": container with ID starting with 757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a not found: ID does not exist" containerID="757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.558505 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a"} err="failed to get container status \"757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a\": rpc error: code = NotFound desc = could not find container 
\"757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a\": container with ID starting with 757022eda9ecb54c61a5d8cc1726a4ef120e9688e21c394eb17df8520b4ad69a not found: ID does not exist" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.558529 4761 scope.go:117] "RemoveContainer" containerID="d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.596347 4761 scope.go:117] "RemoveContainer" containerID="d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673" Dec 01 10:56:44 crc kubenswrapper[4761]: E1201 10:56:44.596906 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673\": container with ID starting with d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673 not found: ID does not exist" containerID="d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.598104 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673"} err="failed to get container status \"d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673\": rpc error: code = NotFound desc = could not find container \"d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673\": container with ID starting with d2de9eddf282fe62f13dd3f648c798bbc3215334feb91ee19fab371853200673 not found: ID does not exist" Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.618724 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg"] Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.624213 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg"] Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.629182 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-stmvr"] Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.636798 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-stmvr"] Dec 01 10:56:44 crc kubenswrapper[4761]: I1201 10:56:44.642352 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-547f66dd48-gbkdg" podUID="1476658c-4234-4688-9c90-25ec6ba4a55d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.88:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:56:45 crc kubenswrapper[4761]: I1201 10:56:45.135884 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1476658c-4234-4688-9c90-25ec6ba4a55d" path="/var/lib/kubelet/pods/1476658c-4234-4688-9c90-25ec6ba4a55d/volumes" Dec 01 10:56:45 crc kubenswrapper[4761]: I1201 10:56:45.137320 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495e601e-4706-49fe-b2c9-5d7eb14bf566" path="/var/lib/kubelet/pods/495e601e-4706-49fe-b2c9-5d7eb14bf566/volumes" Dec 01 10:56:45 crc kubenswrapper[4761]: I1201 10:56:45.138677 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3153419-6c29-4301-a072-acfcee97b630" path="/var/lib/kubelet/pods/c3153419-6c29-4301-a072-acfcee97b630/volumes" Dec 01 10:56:45 crc kubenswrapper[4761]: I1201 10:56:45.801873 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5"] Dec 01 10:56:45 crc kubenswrapper[4761]: I1201 10:56:45.802145 4761 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" podUID="28ed4847-2217-4e5d-8d1b-7006e6098116" containerName="manager" containerID="cri-o://ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b" gracePeriod=10 Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.086106 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-txt4b"] Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.086320 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-txt4b" podUID="6c0796c3-509a-4117-8973-0a740ba1dc2f" containerName="registry-server" containerID="cri-o://93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303" gracePeriod=30 Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.118695 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5"] Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.125040 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/49c083020ae5dfe237b73a6c6b807501660a323f061d7879268c43a121nlnj5"] Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.352479 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.479294 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-webhook-cert\") pod \"28ed4847-2217-4e5d-8d1b-7006e6098116\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.479376 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz44k\" (UniqueName: \"kubernetes.io/projected/28ed4847-2217-4e5d-8d1b-7006e6098116-kube-api-access-qz44k\") pod \"28ed4847-2217-4e5d-8d1b-7006e6098116\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.479427 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-apiservice-cert\") pod \"28ed4847-2217-4e5d-8d1b-7006e6098116\" (UID: \"28ed4847-2217-4e5d-8d1b-7006e6098116\") " Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.484442 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "28ed4847-2217-4e5d-8d1b-7006e6098116" (UID: "28ed4847-2217-4e5d-8d1b-7006e6098116"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.484538 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ed4847-2217-4e5d-8d1b-7006e6098116-kube-api-access-qz44k" (OuterVolumeSpecName: "kube-api-access-qz44k") pod "28ed4847-2217-4e5d-8d1b-7006e6098116" (UID: "28ed4847-2217-4e5d-8d1b-7006e6098116"). 
InnerVolumeSpecName "kube-api-access-qz44k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.484884 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "28ed4847-2217-4e5d-8d1b-7006e6098116" (UID: "28ed4847-2217-4e5d-8d1b-7006e6098116"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.492007 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.551371 4761 generic.go:334] "Generic (PLEG): container finished" podID="6c0796c3-509a-4117-8973-0a740ba1dc2f" containerID="93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303" exitCode=0 Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.551435 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-txt4b" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.551470 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-txt4b" event={"ID":"6c0796c3-509a-4117-8973-0a740ba1dc2f","Type":"ContainerDied","Data":"93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303"} Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.551679 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-txt4b" event={"ID":"6c0796c3-509a-4117-8973-0a740ba1dc2f","Type":"ContainerDied","Data":"e4a087d67363101254f06fb716372580355d038303ce841b399351f28f6cd648"} Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.551715 4761 scope.go:117] "RemoveContainer" containerID="93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.553385 4761 generic.go:334] "Generic (PLEG): container finished" podID="28ed4847-2217-4e5d-8d1b-7006e6098116" containerID="ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b" exitCode=0 Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.553424 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.553433 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" event={"ID":"28ed4847-2217-4e5d-8d1b-7006e6098116","Type":"ContainerDied","Data":"ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b"} Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.553459 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5" event={"ID":"28ed4847-2217-4e5d-8d1b-7006e6098116","Type":"ContainerDied","Data":"38ce51711ed420ea38464fd7d11c752ad63986bbff365c4f09f51405785a859f"} Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.580778 4761 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.580809 4761 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28ed4847-2217-4e5d-8d1b-7006e6098116-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.580821 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz44k\" (UniqueName: \"kubernetes.io/projected/28ed4847-2217-4e5d-8d1b-7006e6098116-kube-api-access-qz44k\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.583394 4761 scope.go:117] "RemoveContainer" containerID="93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303" Dec 01 10:56:46 crc kubenswrapper[4761]: E1201 10:56:46.583882 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303\": container with ID starting with 93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303 not found: ID does not exist" containerID="93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.583915 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303"} err="failed to get container status \"93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303\": rpc error: code = NotFound desc = could not find container \"93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303\": container with ID starting with 93f049dab5067efe9499edb8ce8d9e0b5cfecd947bfa0cb290d38f9ef1b68303 not found: ID does not exist" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.583934 4761 scope.go:117] "RemoveContainer" containerID="ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.595088 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5"] Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.608023 4761 scope.go:117] "RemoveContainer" containerID="ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b" Dec 01 10:56:46 crc kubenswrapper[4761]: E1201 10:56:46.608576 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b\": container with ID starting with ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b not found: ID does not exist" containerID="ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.608629 4761 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b"} err="failed to get container status \"ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b\": rpc error: code = NotFound desc = could not find container \"ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b\": container with ID starting with ab375e51731307ccd3003ef34db6f92657537a2c28d199bf43d95eb309d2602b not found: ID does not exist" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.608995 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-86757b45cc-8hmf5"] Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.681383 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ttfv\" (UniqueName: \"kubernetes.io/projected/6c0796c3-509a-4117-8973-0a740ba1dc2f-kube-api-access-9ttfv\") pod \"6c0796c3-509a-4117-8973-0a740ba1dc2f\" (UID: \"6c0796c3-509a-4117-8973-0a740ba1dc2f\") " Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.683961 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0796c3-509a-4117-8973-0a740ba1dc2f-kube-api-access-9ttfv" (OuterVolumeSpecName: "kube-api-access-9ttfv") pod "6c0796c3-509a-4117-8973-0a740ba1dc2f" (UID: "6c0796c3-509a-4117-8973-0a740ba1dc2f"). InnerVolumeSpecName "kube-api-access-9ttfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.783515 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ttfv\" (UniqueName: \"kubernetes.io/projected/6c0796c3-509a-4117-8973-0a740ba1dc2f-kube-api-access-9ttfv\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.905745 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-txt4b"] Dec 01 10:56:46 crc kubenswrapper[4761]: I1201 10:56:46.911964 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-txt4b"] Dec 01 10:56:47 crc kubenswrapper[4761]: I1201 10:56:47.141975 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ed4847-2217-4e5d-8d1b-7006e6098116" path="/var/lib/kubelet/pods/28ed4847-2217-4e5d-8d1b-7006e6098116/volumes" Dec 01 10:56:47 crc kubenswrapper[4761]: I1201 10:56:47.143172 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f85f97-cf13-45ad-9b74-50272f00a8be" path="/var/lib/kubelet/pods/33f85f97-cf13-45ad-9b74-50272f00a8be/volumes" Dec 01 10:56:47 crc kubenswrapper[4761]: I1201 10:56:47.144424 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c0796c3-509a-4117-8973-0a740ba1dc2f" path="/var/lib/kubelet/pods/6c0796c3-509a-4117-8973-0a740ba1dc2f/volumes" Dec 01 10:56:48 crc kubenswrapper[4761]: I1201 10:56:48.867351 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq"] Dec 01 10:56:48 crc kubenswrapper[4761]: I1201 10:56:48.867925 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" podUID="f9355b38-86ff-42a4-80ea-c34b540953df" containerName="operator" containerID="cri-o://c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57" 
gracePeriod=10 Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.199818 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qms5r"] Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.200305 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" podUID="3d9174c7-4f65-40de-941a-4e10bf61eb65" containerName="registry-server" containerID="cri-o://e1393851c143b809f1edcc774d597693f6939e74fd12531a0323617674654839" gracePeriod=30 Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.221215 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh"] Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.234446 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590s22nh"] Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.363010 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.503197 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6bd7\" (UniqueName: \"kubernetes.io/projected/f9355b38-86ff-42a4-80ea-c34b540953df-kube-api-access-f6bd7\") pod \"f9355b38-86ff-42a4-80ea-c34b540953df\" (UID: \"f9355b38-86ff-42a4-80ea-c34b540953df\") " Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.508735 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9355b38-86ff-42a4-80ea-c34b540953df-kube-api-access-f6bd7" (OuterVolumeSpecName: "kube-api-access-f6bd7") pod "f9355b38-86ff-42a4-80ea-c34b540953df" (UID: "f9355b38-86ff-42a4-80ea-c34b540953df"). InnerVolumeSpecName "kube-api-access-f6bd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.580913 4761 generic.go:334] "Generic (PLEG): container finished" podID="3d9174c7-4f65-40de-941a-4e10bf61eb65" containerID="e1393851c143b809f1edcc774d597693f6939e74fd12531a0323617674654839" exitCode=0 Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.580952 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" event={"ID":"3d9174c7-4f65-40de-941a-4e10bf61eb65","Type":"ContainerDied","Data":"e1393851c143b809f1edcc774d597693f6939e74fd12531a0323617674654839"} Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.581014 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" event={"ID":"3d9174c7-4f65-40de-941a-4e10bf61eb65","Type":"ContainerDied","Data":"d54cf7814de50f24e03dabf8fa5d76c20af11db2880eb6153241716515d4f2cc"} Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.581029 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d54cf7814de50f24e03dabf8fa5d76c20af11db2880eb6153241716515d4f2cc" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.581885 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.582561 4761 generic.go:334] "Generic (PLEG): container finished" podID="f9355b38-86ff-42a4-80ea-c34b540953df" containerID="c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57" exitCode=0 Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.582594 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" event={"ID":"f9355b38-86ff-42a4-80ea-c34b540953df","Type":"ContainerDied","Data":"c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57"} Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.582611 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" event={"ID":"f9355b38-86ff-42a4-80ea-c34b540953df","Type":"ContainerDied","Data":"de507c70c37e0cdba746b0533b65e42d77e188123f664adfb5f94a8a6c813335"} Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.582629 4761 scope.go:117] "RemoveContainer" containerID="c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.582732 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.605130 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6bd7\" (UniqueName: \"kubernetes.io/projected/f9355b38-86ff-42a4-80ea-c34b540953df-kube-api-access-f6bd7\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.613231 4761 scope.go:117] "RemoveContainer" containerID="c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57" Dec 01 10:56:49 crc kubenswrapper[4761]: E1201 10:56:49.614240 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57\": container with ID starting with c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57 not found: ID does not exist" containerID="c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.614313 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57"} err="failed to get container status \"c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57\": rpc error: code = NotFound desc = could not find container \"c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57\": container with ID starting with c509a983500a6eef012d14dcd70b2582f4d27701c507828329f8118bddeefb57 not found: ID does not exist" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.636829 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq"] Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.642077 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-plxnq"] Dec 01 
10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.706033 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxdwl\" (UniqueName: \"kubernetes.io/projected/3d9174c7-4f65-40de-941a-4e10bf61eb65-kube-api-access-bxdwl\") pod \"3d9174c7-4f65-40de-941a-4e10bf61eb65\" (UID: \"3d9174c7-4f65-40de-941a-4e10bf61eb65\") " Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.709323 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9174c7-4f65-40de-941a-4e10bf61eb65-kube-api-access-bxdwl" (OuterVolumeSpecName: "kube-api-access-bxdwl") pod "3d9174c7-4f65-40de-941a-4e10bf61eb65" (UID: "3d9174c7-4f65-40de-941a-4e10bf61eb65"). InnerVolumeSpecName "kube-api-access-bxdwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:49 crc kubenswrapper[4761]: I1201 10:56:49.807228 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxdwl\" (UniqueName: \"kubernetes.io/projected/3d9174c7-4f65-40de-941a-4e10bf61eb65-kube-api-access-bxdwl\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:50 crc kubenswrapper[4761]: I1201 10:56:50.592365 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-qms5r" Dec 01 10:56:50 crc kubenswrapper[4761]: I1201 10:56:50.635508 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qms5r"] Dec 01 10:56:50 crc kubenswrapper[4761]: I1201 10:56:50.641847 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qms5r"] Dec 01 10:56:51 crc kubenswrapper[4761]: I1201 10:56:51.138052 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9174c7-4f65-40de-941a-4e10bf61eb65" path="/var/lib/kubelet/pods/3d9174c7-4f65-40de-941a-4e10bf61eb65/volumes" Dec 01 10:56:51 crc kubenswrapper[4761]: I1201 10:56:51.138951 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac62f65-15c5-4ba6-881b-0af0ccf341b6" path="/var/lib/kubelet/pods/dac62f65-15c5-4ba6-881b-0af0ccf341b6/volumes" Dec 01 10:56:51 crc kubenswrapper[4761]: I1201 10:56:51.139719 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9355b38-86ff-42a4-80ea-c34b540953df" path="/var/lib/kubelet/pods/f9355b38-86ff-42a4-80ea-c34b540953df/volumes" Dec 01 10:56:52 crc kubenswrapper[4761]: I1201 10:56:52.870443 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.153:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:56:52 crc kubenswrapper[4761]: I1201 10:56:52.870625 4761 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="c7d2c9c2-90a7-477b-80ea-fcdb1e8e649f" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.153:9292/healthcheck\": dial tcp 10.217.0.153:9292: i/o timeout (Client.Timeout exceeded 
while awaiting headers)" Dec 01 10:56:53 crc kubenswrapper[4761]: I1201 10:56:53.439669 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl"] Dec 01 10:56:53 crc kubenswrapper[4761]: I1201 10:56:53.439940 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerName="manager" containerID="cri-o://a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6" gracePeriod=10 Dec 01 10:56:53 crc kubenswrapper[4761]: I1201 10:56:53.439999 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerName="kube-rbac-proxy" containerID="cri-o://8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df" gracePeriod=10 Dec 01 10:56:53 crc kubenswrapper[4761]: I1201 10:56:53.698004 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-vgg5w"] Dec 01 10:56:53 crc kubenswrapper[4761]: I1201 10:56:53.698270 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-vgg5w" podUID="66d6a565-82b4-42d3-b803-9ff143c8a8bc" containerName="registry-server" containerID="cri-o://5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7" gracePeriod=30 Dec 01 10:56:53 crc kubenswrapper[4761]: I1201 10:56:53.744700 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr"] Dec 01 10:56:53 crc kubenswrapper[4761]: I1201 10:56:53.749004 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dvqmrr"] Dec 01 10:56:53 crc 
kubenswrapper[4761]: E1201 10:56:53.993191 4761 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7 is running failed: container process not found" containerID="5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 10:56:53 crc kubenswrapper[4761]: E1201 10:56:53.993518 4761 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7 is running failed: container process not found" containerID="5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 10:56:53 crc kubenswrapper[4761]: E1201 10:56:53.993807 4761 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7 is running failed: container process not found" containerID="5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 10:56:53 crc kubenswrapper[4761]: E1201 10:56:53.993874 4761 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7 is running failed: container process not found" probeType="Readiness" pod="openstack-operators/infra-operator-index-vgg5w" podUID="66d6a565-82b4-42d3-b803-9ff143c8a8bc" containerName="registry-server" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.184090 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.355756 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.369960 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grb6x\" (UniqueName: \"kubernetes.io/projected/66d6a565-82b4-42d3-b803-9ff143c8a8bc-kube-api-access-grb6x\") pod \"66d6a565-82b4-42d3-b803-9ff143c8a8bc\" (UID: \"66d6a565-82b4-42d3-b803-9ff143c8a8bc\") " Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.376127 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d6a565-82b4-42d3-b803-9ff143c8a8bc-kube-api-access-grb6x" (OuterVolumeSpecName: "kube-api-access-grb6x") pod "66d6a565-82b4-42d3-b803-9ff143c8a8bc" (UID: "66d6a565-82b4-42d3-b803-9ff143c8a8bc"). InnerVolumeSpecName "kube-api-access-grb6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.470863 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-apiservice-cert\") pod \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.471321 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbk4d\" (UniqueName: \"kubernetes.io/projected/e1a6426b-c4ef-4874-8f48-a59d830ae08d-kube-api-access-xbk4d\") pod \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.471623 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-webhook-cert\") pod \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\" (UID: \"e1a6426b-c4ef-4874-8f48-a59d830ae08d\") " Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.472923 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grb6x\" (UniqueName: \"kubernetes.io/projected/66d6a565-82b4-42d3-b803-9ff143c8a8bc-kube-api-access-grb6x\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.474700 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "e1a6426b-c4ef-4874-8f48-a59d830ae08d" (UID: "e1a6426b-c4ef-4874-8f48-a59d830ae08d"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.476149 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a6426b-c4ef-4874-8f48-a59d830ae08d-kube-api-access-xbk4d" (OuterVolumeSpecName: "kube-api-access-xbk4d") pod "e1a6426b-c4ef-4874-8f48-a59d830ae08d" (UID: "e1a6426b-c4ef-4874-8f48-a59d830ae08d"). InnerVolumeSpecName "kube-api-access-xbk4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.477844 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "e1a6426b-c4ef-4874-8f48-a59d830ae08d" (UID: "e1a6426b-c4ef-4874-8f48-a59d830ae08d"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.574527 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbk4d\" (UniqueName: \"kubernetes.io/projected/e1a6426b-c4ef-4874-8f48-a59d830ae08d-kube-api-access-xbk4d\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.574798 4761 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.574861 4761 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1a6426b-c4ef-4874-8f48-a59d830ae08d-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.631223 4761 generic.go:334] "Generic (PLEG): container finished" podID="66d6a565-82b4-42d3-b803-9ff143c8a8bc" 
containerID="5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7" exitCode=0 Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.631287 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-vgg5w" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.631287 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-vgg5w" event={"ID":"66d6a565-82b4-42d3-b803-9ff143c8a8bc","Type":"ContainerDied","Data":"5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7"} Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.631793 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-vgg5w" event={"ID":"66d6a565-82b4-42d3-b803-9ff143c8a8bc","Type":"ContainerDied","Data":"a450c237459da4afcffc39341017f6ac90adfaa578a6d8a8bed88bbc488d8154"} Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.631820 4761 scope.go:117] "RemoveContainer" containerID="5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.635127 4761 generic.go:334] "Generic (PLEG): container finished" podID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerID="8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df" exitCode=0 Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.635150 4761 generic.go:334] "Generic (PLEG): container finished" podID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerID="a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6" exitCode=0 Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.635168 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" event={"ID":"e1a6426b-c4ef-4874-8f48-a59d830ae08d","Type":"ContainerDied","Data":"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df"} Dec 01 10:56:54 crc 
kubenswrapper[4761]: I1201 10:56:54.635202 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" event={"ID":"e1a6426b-c4ef-4874-8f48-a59d830ae08d","Type":"ContainerDied","Data":"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6"} Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.635213 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" event={"ID":"e1a6426b-c4ef-4874-8f48-a59d830ae08d","Type":"ContainerDied","Data":"fe92fd9f609911f8d1b987f9c4037b864e1288bf46fcb9a82ab26d5ec33ac184"} Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.635260 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.660341 4761 scope.go:117] "RemoveContainer" containerID="5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7" Dec 01 10:56:54 crc kubenswrapper[4761]: E1201 10:56:54.661108 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7\": container with ID starting with 5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7 not found: ID does not exist" containerID="5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.661151 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7"} err="failed to get container status \"5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7\": rpc error: code = NotFound desc = could not find container \"5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7\": 
container with ID starting with 5c1647758f2fea23b50012bbaedee49508803a1690d57c2c3b8c507669055dc7 not found: ID does not exist" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.661176 4761 scope.go:117] "RemoveContainer" containerID="8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.688303 4761 scope.go:117] "RemoveContainer" containerID="a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.691029 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl"] Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.700792 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cf567c5-99jtl"] Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.710963 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-vgg5w"] Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.711109 4761 scope.go:117] "RemoveContainer" containerID="8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df" Dec 01 10:56:54 crc kubenswrapper[4761]: E1201 10:56:54.712000 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df\": container with ID starting with 8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df not found: ID does not exist" containerID="8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.712032 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df"} err="failed to get container status 
\"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df\": rpc error: code = NotFound desc = could not find container \"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df\": container with ID starting with 8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df not found: ID does not exist" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.712059 4761 scope.go:117] "RemoveContainer" containerID="a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6" Dec 01 10:56:54 crc kubenswrapper[4761]: E1201 10:56:54.712309 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6\": container with ID starting with a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6 not found: ID does not exist" containerID="a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.712337 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6"} err="failed to get container status \"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6\": rpc error: code = NotFound desc = could not find container \"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6\": container with ID starting with a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6 not found: ID does not exist" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.712358 4761 scope.go:117] "RemoveContainer" containerID="8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.712580 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df"} err="failed to get 
container status \"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df\": rpc error: code = NotFound desc = could not find container \"8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df\": container with ID starting with 8dafb8ed9e51af7220c27748b09ba76ede41847c82c3eebb7dd7419d42e062df not found: ID does not exist" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.712598 4761 scope.go:117] "RemoveContainer" containerID="a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.712907 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6"} err="failed to get container status \"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6\": rpc error: code = NotFound desc = could not find container \"a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6\": container with ID starting with a9fdd710a153669705196fe4d7c4547b8d11960df1a8777351b42e21d5d865d6 not found: ID does not exist" Dec 01 10:56:54 crc kubenswrapper[4761]: I1201 10:56:54.714059 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-vgg5w"] Dec 01 10:56:55 crc kubenswrapper[4761]: I1201 10:56:55.140202 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d6a565-82b4-42d3-b803-9ff143c8a8bc" path="/var/lib/kubelet/pods/66d6a565-82b4-42d3-b803-9ff143c8a8bc/volumes" Dec 01 10:56:55 crc kubenswrapper[4761]: I1201 10:56:55.141378 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f57fa1-0cb4-4df5-8675-7789b6e83ef7" path="/var/lib/kubelet/pods/73f57fa1-0cb4-4df5-8675-7789b6e83ef7/volumes" Dec 01 10:56:55 crc kubenswrapper[4761]: I1201 10:56:55.142451 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" 
path="/var/lib/kubelet/pods/e1a6426b-c4ef-4874-8f48-a59d830ae08d/volumes" Dec 01 10:56:55 crc kubenswrapper[4761]: I1201 10:56:55.874598 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q"] Dec 01 10:56:55 crc kubenswrapper[4761]: I1201 10:56:55.874821 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" podUID="635c1195-66ca-4595-8f7d-cb66e37db30f" containerName="manager" containerID="cri-o://a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20" gracePeriod=10 Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.127025 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-mgkj6"] Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.127244 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-mgkj6" podUID="594f3896-fe41-4a3b-878d-849501100194" containerName="registry-server" containerID="cri-o://bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549" gracePeriod=30 Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.194706 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn"] Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.200590 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534fnsvqn"] Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.347492 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.511046 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-webhook-cert\") pod \"635c1195-66ca-4595-8f7d-cb66e37db30f\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.511088 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq4tc\" (UniqueName: \"kubernetes.io/projected/635c1195-66ca-4595-8f7d-cb66e37db30f-kube-api-access-tq4tc\") pod \"635c1195-66ca-4595-8f7d-cb66e37db30f\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.511198 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-apiservice-cert\") pod \"635c1195-66ca-4595-8f7d-cb66e37db30f\" (UID: \"635c1195-66ca-4595-8f7d-cb66e37db30f\") " Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.515598 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635c1195-66ca-4595-8f7d-cb66e37db30f-kube-api-access-tq4tc" (OuterVolumeSpecName: "kube-api-access-tq4tc") pod "635c1195-66ca-4595-8f7d-cb66e37db30f" (UID: "635c1195-66ca-4595-8f7d-cb66e37db30f"). InnerVolumeSpecName "kube-api-access-tq4tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.519809 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "635c1195-66ca-4595-8f7d-cb66e37db30f" (UID: "635c1195-66ca-4595-8f7d-cb66e37db30f"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.522595 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "635c1195-66ca-4595-8f7d-cb66e37db30f" (UID: "635c1195-66ca-4595-8f7d-cb66e37db30f"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.525688 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.613161 4761 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.613218 4761 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/635c1195-66ca-4595-8f7d-cb66e37db30f-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.613247 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq4tc\" (UniqueName: \"kubernetes.io/projected/635c1195-66ca-4595-8f7d-cb66e37db30f-kube-api-access-tq4tc\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.655799 4761 generic.go:334] "Generic (PLEG): container finished" podID="635c1195-66ca-4595-8f7d-cb66e37db30f" containerID="a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20" exitCode=0 Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.655862 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" 
event={"ID":"635c1195-66ca-4595-8f7d-cb66e37db30f","Type":"ContainerDied","Data":"a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20"} Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.655927 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" event={"ID":"635c1195-66ca-4595-8f7d-cb66e37db30f","Type":"ContainerDied","Data":"ce748ec2908b38c3c7af51aa5f424832ae2eed33eee71d294cdaf03dd3520f81"} Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.655947 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.655958 4761 scope.go:117] "RemoveContainer" containerID="a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.659643 4761 generic.go:334] "Generic (PLEG): container finished" podID="594f3896-fe41-4a3b-878d-849501100194" containerID="bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549" exitCode=0 Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.659707 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-mgkj6" event={"ID":"594f3896-fe41-4a3b-878d-849501100194","Type":"ContainerDied","Data":"bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549"} Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.659745 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-mgkj6" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.659746 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-mgkj6" event={"ID":"594f3896-fe41-4a3b-878d-849501100194","Type":"ContainerDied","Data":"edcb653c1a3e8b97e3b281985c260df8dbdadc87604c9f7e0d203ec7b420071e"} Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.683196 4761 scope.go:117] "RemoveContainer" containerID="a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20" Dec 01 10:56:56 crc kubenswrapper[4761]: E1201 10:56:56.683878 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20\": container with ID starting with a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20 not found: ID does not exist" containerID="a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.684009 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20"} err="failed to get container status \"a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20\": rpc error: code = NotFound desc = could not find container \"a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20\": container with ID starting with a6048d5fbf4edb53949091b1042f545574e4fc5fb05168bf587efe9657bbdb20 not found: ID does not exist" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.684073 4761 scope.go:117] "RemoveContainer" containerID="bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.693098 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q"] Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.706074 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67d6f98b9-pxc6q"] Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.715249 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmcd4\" (UniqueName: \"kubernetes.io/projected/594f3896-fe41-4a3b-878d-849501100194-kube-api-access-kmcd4\") pod \"594f3896-fe41-4a3b-878d-849501100194\" (UID: \"594f3896-fe41-4a3b-878d-849501100194\") " Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.720474 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594f3896-fe41-4a3b-878d-849501100194-kube-api-access-kmcd4" (OuterVolumeSpecName: "kube-api-access-kmcd4") pod "594f3896-fe41-4a3b-878d-849501100194" (UID: "594f3896-fe41-4a3b-878d-849501100194"). InnerVolumeSpecName "kube-api-access-kmcd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.728908 4761 scope.go:117] "RemoveContainer" containerID="bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549" Dec 01 10:56:56 crc kubenswrapper[4761]: E1201 10:56:56.729359 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549\": container with ID starting with bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549 not found: ID does not exist" containerID="bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.729397 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549"} err="failed to get container status \"bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549\": rpc error: code = NotFound desc = could not find container \"bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549\": container with ID starting with bf9e3923d6e506064ffaec7192b49dac22bd414c452bf0121e08587c9cfa5549 not found: ID does not exist" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.816768 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmcd4\" (UniqueName: \"kubernetes.io/projected/594f3896-fe41-4a3b-878d-849501100194-kube-api-access-kmcd4\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.991559 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-mgkj6"] Dec 01 10:56:56 crc kubenswrapper[4761]: I1201 10:56:56.996627 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-mgkj6"] Dec 01 10:56:57 crc kubenswrapper[4761]: I1201 10:56:57.139915 4761 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da4d646-d8b2-40dc-8a0e-9f66b3567d3f" path="/var/lib/kubelet/pods/4da4d646-d8b2-40dc-8a0e-9f66b3567d3f/volumes" Dec 01 10:56:57 crc kubenswrapper[4761]: I1201 10:56:57.141132 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594f3896-fe41-4a3b-878d-849501100194" path="/var/lib/kubelet/pods/594f3896-fe41-4a3b-878d-849501100194/volumes" Dec 01 10:56:57 crc kubenswrapper[4761]: I1201 10:56:57.142065 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="635c1195-66ca-4595-8f7d-cb66e37db30f" path="/var/lib/kubelet/pods/635c1195-66ca-4595-8f7d-cb66e37db30f/volumes" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.655069 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.656513 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fdrx\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-kube-api-access-4fdrx\") pod \"20f34da4-e281-4e68-9a1f-02c97211a365\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.656599 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-cache\") pod \"20f34da4-e281-4e68-9a1f-02c97211a365\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.657354 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-cache" (OuterVolumeSpecName: "cache") pod "20f34da4-e281-4e68-9a1f-02c97211a365" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.663028 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-kube-api-access-4fdrx" (OuterVolumeSpecName: "kube-api-access-4fdrx") pod "20f34da4-e281-4e68-9a1f-02c97211a365" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365"). InnerVolumeSpecName "kube-api-access-4fdrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.746700 4761 generic.go:334] "Generic (PLEG): container finished" podID="20f34da4-e281-4e68-9a1f-02c97211a365" containerID="3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e" exitCode=137 Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.746788 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e"} Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.746866 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.747054 4761 scope.go:117] "RemoveContainer" containerID="3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.747041 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"20f34da4-e281-4e68-9a1f-02c97211a365","Type":"ContainerDied","Data":"9816df5aba32f3957e11a874249ec2b76f4fe6b99bec96944b56afc83080b688"} Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.758052 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"20f34da4-e281-4e68-9a1f-02c97211a365\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.758133 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-lock\") pod \"20f34da4-e281-4e68-9a1f-02c97211a365\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.758403 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") pod \"20f34da4-e281-4e68-9a1f-02c97211a365\" (UID: \"20f34da4-e281-4e68-9a1f-02c97211a365\") " Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.758756 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-lock" (OuterVolumeSpecName: "lock") pod "20f34da4-e281-4e68-9a1f-02c97211a365" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.758881 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fdrx\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-kube-api-access-4fdrx\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.758922 4761 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-cache\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.763765 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "20f34da4-e281-4e68-9a1f-02c97211a365" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.768798 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "swift") pod "20f34da4-e281-4e68-9a1f-02c97211a365" (UID: "20f34da4-e281-4e68-9a1f-02c97211a365"). InnerVolumeSpecName "local-storage19-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.779804 4761 scope.go:117] "RemoveContainer" containerID="163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.828148 4761 scope.go:117] "RemoveContainer" containerID="2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.842742 4761 scope.go:117] "RemoveContainer" containerID="c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.856835 4761 scope.go:117] "RemoveContainer" containerID="b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.860194 4761 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20f34da4-e281-4e68-9a1f-02c97211a365-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.860238 4761 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.860252 4761 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20f34da4-e281-4e68-9a1f-02c97211a365-lock\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.873838 4761 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.875712 4761 scope.go:117] "RemoveContainer" containerID="cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 
10:57:04.896189 4761 scope.go:117] "RemoveContainer" containerID="3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.922628 4761 scope.go:117] "RemoveContainer" containerID="0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.940336 4761 scope.go:117] "RemoveContainer" containerID="b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.960998 4761 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.964253 4761 scope.go:117] "RemoveContainer" containerID="cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b" Dec 01 10:57:04 crc kubenswrapper[4761]: I1201 10:57:04.987423 4761 scope.go:117] "RemoveContainer" containerID="33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.015026 4761 scope.go:117] "RemoveContainer" containerID="9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.050450 4761 scope.go:117] "RemoveContainer" containerID="d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.084825 4761 scope.go:117] "RemoveContainer" containerID="b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.091621 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.097599 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 
10:57:05.101492 4761 scope.go:117] "RemoveContainer" containerID="8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.117353 4761 scope.go:117] "RemoveContainer" containerID="3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.117847 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e\": container with ID starting with 3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e not found: ID does not exist" containerID="3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.117892 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e"} err="failed to get container status \"3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e\": rpc error: code = NotFound desc = could not find container \"3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e\": container with ID starting with 3edaab82be56a6c4094f26537bbd15de87f18b20f83811e5686686af74ee8f5e not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.117921 4761 scope.go:117] "RemoveContainer" containerID="163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.118526 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341\": container with ID starting with 163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341 not found: ID does not exist" 
containerID="163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.118574 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341"} err="failed to get container status \"163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341\": rpc error: code = NotFound desc = could not find container \"163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341\": container with ID starting with 163000514a0b3708c985de418e100a8d278804b567e4128eb794de05cb8b3341 not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.118596 4761 scope.go:117] "RemoveContainer" containerID="2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.119050 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a\": container with ID starting with 2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a not found: ID does not exist" containerID="2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.119093 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a"} err="failed to get container status \"2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a\": rpc error: code = NotFound desc = could not find container \"2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a\": container with ID starting with 2838f20106bca8607807c41ac0ab01dfe76bc15e1c49953b5e1fe9eef197324a not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.119122 4761 scope.go:117] 
"RemoveContainer" containerID="c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.119502 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b\": container with ID starting with c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b not found: ID does not exist" containerID="c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.119528 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b"} err="failed to get container status \"c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b\": rpc error: code = NotFound desc = could not find container \"c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b\": container with ID starting with c281030039e2df74a4bd111a60ea9a7424262b09ee3798e12b55a2012e1cc90b not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.119542 4761 scope.go:117] "RemoveContainer" containerID="b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.120047 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09\": container with ID starting with b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09 not found: ID does not exist" containerID="b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.120074 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09"} err="failed to get container status \"b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09\": rpc error: code = NotFound desc = could not find container \"b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09\": container with ID starting with b388580811d7bddc19c078c4b11918d65483c3bff2be412eb857c180922e6e09 not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.120091 4761 scope.go:117] "RemoveContainer" containerID="cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.120507 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577\": container with ID starting with cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577 not found: ID does not exist" containerID="cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.120528 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577"} err="failed to get container status \"cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577\": rpc error: code = NotFound desc = could not find container \"cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577\": container with ID starting with cf32f862fd231ee4b9306cfc70591df12508ede989b86f1c96e830c42ba2b577 not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.120541 4761 scope.go:117] "RemoveContainer" containerID="3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.120831 4761 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f\": container with ID starting with 3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f not found: ID does not exist" containerID="3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.120860 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f"} err="failed to get container status \"3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f\": rpc error: code = NotFound desc = could not find container \"3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f\": container with ID starting with 3d864ee2427000207570658495ec609e5dfde200f5e486b8b7525103b5a6c48f not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.120876 4761 scope.go:117] "RemoveContainer" containerID="0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.121091 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187\": container with ID starting with 0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187 not found: ID does not exist" containerID="0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.121118 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187"} err="failed to get container status \"0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187\": rpc error: code = NotFound desc = could not find container 
\"0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187\": container with ID starting with 0ff90a7f06233772412ffd348a31666828c89cd3518dbcf5e19ec18d8063c187 not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.121134 4761 scope.go:117] "RemoveContainer" containerID="b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.121654 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75\": container with ID starting with b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75 not found: ID does not exist" containerID="b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.121675 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75"} err="failed to get container status \"b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75\": rpc error: code = NotFound desc = could not find container \"b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75\": container with ID starting with b67c7627d2f98c533f04e3e7e2e35e3c10d464d124d98d0e2fe9024356bada75 not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.121690 4761 scope.go:117] "RemoveContainer" containerID="cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.121890 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b\": container with ID starting with cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b not found: ID does not exist" 
containerID="cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.121905 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b"} err="failed to get container status \"cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b\": rpc error: code = NotFound desc = could not find container \"cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b\": container with ID starting with cda1b1aaa55aa9e2704ff7bd0075411fce1c7d3493ec85a22d4d9d4d9c03486b not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.121917 4761 scope.go:117] "RemoveContainer" containerID="33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.122118 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835\": container with ID starting with 33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835 not found: ID does not exist" containerID="33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.122141 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835"} err="failed to get container status \"33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835\": rpc error: code = NotFound desc = could not find container \"33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835\": container with ID starting with 33f9860b0c7fbbb00040dfbe884f30b0b35cc226faba0e3090b4d33cda395835 not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.122156 4761 scope.go:117] 
"RemoveContainer" containerID="9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.122698 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479\": container with ID starting with 9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479 not found: ID does not exist" containerID="9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.122728 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479"} err="failed to get container status \"9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479\": rpc error: code = NotFound desc = could not find container \"9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479\": container with ID starting with 9117907f1d39a97636f8c59c2253b61792f55ffc7917974ef26a258bdaef4479 not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.122748 4761 scope.go:117] "RemoveContainer" containerID="d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.123146 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1\": container with ID starting with d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1 not found: ID does not exist" containerID="d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.123168 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1"} err="failed to get container status \"d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1\": rpc error: code = NotFound desc = could not find container \"d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1\": container with ID starting with d658e4de43e37223e51eefcf67aabc0488610a496b389991876cd67d6900c9c1 not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.123182 4761 scope.go:117] "RemoveContainer" containerID="b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.123388 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa\": container with ID starting with b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa not found: ID does not exist" containerID="b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.123409 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa"} err="failed to get container status \"b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa\": rpc error: code = NotFound desc = could not find container \"b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa\": container with ID starting with b55fcf1ed6922d39178867eef4fd72140f48c5a842dceed828406a16ad62b0fa not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.123427 4761 scope.go:117] "RemoveContainer" containerID="8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a" Dec 01 10:57:05 crc kubenswrapper[4761]: E1201 10:57:05.123686 4761 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a\": container with ID starting with 8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a not found: ID does not exist" containerID="8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.123710 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a"} err="failed to get container status \"8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a\": rpc error: code = NotFound desc = could not find container \"8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a\": container with ID starting with 8e7e1a938925574a7d6f4b5089aa304d5742ef5143bab7b406d4e68a978d2a7a not found: ID does not exist" Dec 01 10:57:05 crc kubenswrapper[4761]: I1201 10:57:05.136108 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" path="/var/lib/kubelet/pods/20f34da4-e281-4e68-9a1f-02c97211a365/volumes" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.273867 4761 scope.go:117] "RemoveContainer" containerID="69eb9966894da3611f83c4534d92de43cf90aa0ad307fa6a9cc8088c4ad4e0de" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.304098 4761 scope.go:117] "RemoveContainer" containerID="f5ec29afb6dd6fe8f692c8ed2cfce9766e84d8e73ddda4deceb3b5762919db99" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.348645 4761 scope.go:117] "RemoveContainer" containerID="14d87bcaf79fd9b037204d8c67015069d18134a279a232db8b249f038dcfe77a" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.384420 4761 scope.go:117] "RemoveContainer" containerID="0ea1e4467b349a028e34592014f2f13e8aae19f3b904182cd25d19d762464255" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.411875 4761 scope.go:117] 
"RemoveContainer" containerID="7fb4b133070a8796a792130396df9c4d5bb21afea337b75723314c53b582ac41" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.451534 4761 scope.go:117] "RemoveContainer" containerID="e1393851c143b809f1edcc774d597693f6939e74fd12531a0323617674654839" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.482457 4761 scope.go:117] "RemoveContainer" containerID="633a7783582d09f72ccc52631860fc64bdc3acb7073a38836ff5ffe307c570ab" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.511660 4761 scope.go:117] "RemoveContainer" containerID="bc35d40c83fc14655090b1b0ad4e92e1ceca84962cc0e9572df8655009fe3c37" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.551640 4761 scope.go:117] "RemoveContainer" containerID="e8bb3ec6cdf14940b9b45e0dc53f2cd8adc1388a4630439431039fbfdd12d7c3" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.575722 4761 scope.go:117] "RemoveContainer" containerID="9c09540d8e0bd15d479bf5b04e66d385edb8f004ff3375ea3ecaec4b21a1f9a3" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.592360 4761 scope.go:117] "RemoveContainer" containerID="02e78242e3db8585218da5a7e36f422dd1b206345fc629c97582aa60fce53a6b" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.609404 4761 scope.go:117] "RemoveContainer" containerID="1107dd2bfb9ba0fe64f35e4825249099b9e1e6e8c05193d90a9876468d91c4b3" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.624356 4761 scope.go:117] "RemoveContainer" containerID="dd9570c7685191bb06ff2423acc8abe2c844a69c78ce987216c76f7dbd9e047e" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.636769 4761 scope.go:117] "RemoveContainer" containerID="0c3dd15463622322d202883f390010b0d27b1b8df8db8d2e2be033ebbf98e8f9" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.649942 4761 scope.go:117] "RemoveContainer" containerID="4117e8a839798d1372a47df36bd00d5bf0ba6c25e23380deda04eae81864d79b" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.665375 4761 scope.go:117] "RemoveContainer" 
containerID="d76403c1539eb5763266fa719c150d040730bba08c60e3759b09e5da1b68c987" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.693008 4761 scope.go:117] "RemoveContainer" containerID="db435de52d06795a3379839fca9be76cdafb208e1fb066917b90badef32e43b6" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.712376 4761 scope.go:117] "RemoveContainer" containerID="a44d4cca0b0bf03637c4a7ffe033f74179ed74f2deb924b28c83bc4b34bcff99" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.725791 4761 scope.go:117] "RemoveContainer" containerID="33afbbd18a2056b8f427cb5fbddf15f01c522bce670fd708cca0f8529787a3ae" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.744122 4761 scope.go:117] "RemoveContainer" containerID="cc47b488b9d91db956df0c9f77285bdd2e4fa3ee82efad0b5fcdd75d5c1dfb85" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.758890 4761 scope.go:117] "RemoveContainer" containerID="3087bb4c94619cfbd60bbfe760dfc2942ab2b1baaf310a4da24660c880808b2f" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.773141 4761 scope.go:117] "RemoveContainer" containerID="dca4bb5c40a5e876c00219be44d21a85c9667bb8bafa04d62073cd32e7c9a895" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.788226 4761 scope.go:117] "RemoveContainer" containerID="3775b7f5b5da5a4e95f3199cc37879688beb36c61d95e7ab45874b69e3505484" Dec 01 10:57:08 crc kubenswrapper[4761]: I1201 10:57:08.821420 4761 scope.go:117] "RemoveContainer" containerID="a2a9b29e337986a18d243476a36bbcd121a99b7291bcdb0a7aaef90de906d052" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.512484 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l5js8/must-gather-sqjdn"] Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513172 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" containerName="mysql-bootstrap" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513190 4761 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" containerName="mysql-bootstrap" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513206 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" containerName="setup-container" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513216 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" containerName="setup-container" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513232 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0796c3-509a-4117-8973-0a740ba1dc2f" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513244 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0796c3-509a-4117-8973-0a740ba1dc2f" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513256 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594f3896-fe41-4a3b-878d-849501100194" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513266 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="594f3896-fe41-4a3b-878d-849501100194" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513282 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d62685-3430-4fba-b0ca-34ae3169f562" containerName="galera" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513292 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d62685-3430-4fba-b0ca-34ae3169f562" containerName="galera" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513308 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513318 4761 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513333 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513342 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513355 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" containerName="galera" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513365 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" containerName="galera" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513377 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635c1195-66ca-4595-8f7d-cb66e37db30f" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513387 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="635c1195-66ca-4595-8f7d-cb66e37db30f" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513401 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" containerName="keystone-api" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513410 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" containerName="keystone-api" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513428 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" containerName="galera" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513437 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" containerName="galera" Dec 01 
10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513448 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b286f6-2061-4845-a2ea-68fb621ff4d0" containerName="memcached" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513459 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b286f6-2061-4845-a2ea-68fb621ff4d0" containerName="memcached" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513473 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-expirer" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513483 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-expirer" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513502 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513510 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513529 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513539 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513615 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" containerName="mysql-bootstrap" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513628 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" containerName="mysql-bootstrap" Dec 01 10:57:09 crc 
kubenswrapper[4761]: E1201 10:57:09.513639 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3153419-6c29-4301-a072-acfcee97b630" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513650 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3153419-6c29-4301-a072-acfcee97b630" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513666 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513676 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513688 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9174c7-4f65-40de-941a-4e10bf61eb65" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513697 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9174c7-4f65-40de-941a-4e10bf61eb65" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513712 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513722 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513733 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9355b38-86ff-42a4-80ea-c34b540953df" containerName="operator" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513743 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9355b38-86ff-42a4-80ea-c34b540953df" containerName="operator" Dec 01 10:57:09 crc 
kubenswrapper[4761]: E1201 10:57:09.513757 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513767 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513778 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513788 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513798 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="rsync" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513807 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="rsync" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513821 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d6a565-82b4-42d3-b803-9ff143c8a8bc" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513830 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d6a565-82b4-42d3-b803-9ff143c8a8bc" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513842 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513853 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: 
E1201 10:57:09.513870 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-reaper" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513879 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-reaper" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513889 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1476658c-4234-4688-9c90-25ec6ba4a55d" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513899 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1476658c-4234-4688-9c90-25ec6ba4a55d" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513914 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" containerName="mariadb-account-delete" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513924 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" containerName="mariadb-account-delete" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513940 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" containerName="mariadb-account-delete" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513950 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" containerName="mariadb-account-delete" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.513965 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerName="kube-rbac-proxy" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513976 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerName="kube-rbac-proxy" Dec 01 10:57:09 crc kubenswrapper[4761]: 
E1201 10:57:09.513989 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d62685-3430-4fba-b0ca-34ae3169f562" containerName="mysql-bootstrap" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.513999 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d62685-3430-4fba-b0ca-34ae3169f562" containerName="mysql-bootstrap" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.514012 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ed4847-2217-4e5d-8d1b-7006e6098116" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514023 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ed4847-2217-4e5d-8d1b-7006e6098116" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.514033 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e884079-1d5d-40f2-a169-f2f0781bad65" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514045 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e884079-1d5d-40f2-a169-f2f0781bad65" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.514060 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-updater" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514071 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-updater" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.514087 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514097 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-server" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.514113 4761 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-updater" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514123 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-updater" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.514134 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514144 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.514157 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="swift-recon-cron" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514166 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="swift-recon-cron" Dec 01 10:57:09 crc kubenswrapper[4761]: E1201 10:57:09.514184 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" containerName="rabbitmq" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514194 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" containerName="rabbitmq" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514350 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="635c1195-66ca-4595-8f7d-cb66e37db30f" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514369 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebdf60a-b95f-4443-9bcc-452c3d2da2ec" containerName="galera" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514384 4761 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514401 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" containerName="mariadb-account-delete" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514414 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6e2a9d-eafc-42c7-8e81-9d5c5760c81c" containerName="keystone-api" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514430 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-updater" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514441 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-expirer" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514455 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ed4847-2217-4e5d-8d1b-7006e6098116" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514472 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514486 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514501 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-updater" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514513 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07e5919-c158-40b5-a20d-6c07c7f98ecd" containerName="rabbitmq" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514526 4761 
memory_manager.go:354] "RemoveStaleState removing state" podUID="594f3896-fe41-4a3b-878d-849501100194" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514544 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9174c7-4f65-40de-941a-4e10bf61eb65" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514581 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514599 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="container-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514614 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d6a565-82b4-42d3-b803-9ff143c8a8bc" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514627 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-auditor" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514641 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0796c3-509a-4117-8973-0a740ba1dc2f" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514663 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="object-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514678 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b286f6-2061-4845-a2ea-68fb621ff4d0" containerName="memcached" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514694 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerName="kube-rbac-proxy" Dec 01 10:57:09 crc 
kubenswrapper[4761]: I1201 10:57:09.514708 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-reaper" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514718 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-replicator" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514733 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1476658c-4234-4688-9c90-25ec6ba4a55d" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514746 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9355b38-86ff-42a4-80ea-c34b540953df" containerName="operator" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514760 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="rsync" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514774 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d62685-3430-4fba-b0ca-34ae3169f562" containerName="galera" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514785 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e884079-1d5d-40f2-a169-f2f0781bad65" containerName="registry-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514799 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f389388-aa4f-4fe2-a6a5-b55a9ab9f014" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514809 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="7040d73f-f2e1-4a80-a719-8a2f8ff10f7e" containerName="galera" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514820 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3153419-6c29-4301-a072-acfcee97b630" containerName="registry-server" Dec 01 10:57:09 crc 
kubenswrapper[4761]: I1201 10:57:09.514833 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="account-server" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514848 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f34da4-e281-4e68-9a1f-02c97211a365" containerName="swift-recon-cron" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.514860 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a6426b-c4ef-4874-8f48-a59d830ae08d" containerName="manager" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.515185 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="5226eb1e-f30c-4ef9-a218-d9234255a6ca" containerName="mariadb-account-delete" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.515763 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.525841 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l5js8"/"openshift-service-ca.crt" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.526307 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l5js8"/"kube-root-ca.crt" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.533344 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l5js8/must-gather-sqjdn"] Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.624434 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wdk\" (UniqueName: \"kubernetes.io/projected/1a75c36d-18fc-4133-81d5-a0313a42d9a7-kube-api-access-t2wdk\") pod \"must-gather-sqjdn\" (UID: \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\") " pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 
10:57:09.624488 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a75c36d-18fc-4133-81d5-a0313a42d9a7-must-gather-output\") pod \"must-gather-sqjdn\" (UID: \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\") " pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.725888 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a75c36d-18fc-4133-81d5-a0313a42d9a7-must-gather-output\") pod \"must-gather-sqjdn\" (UID: \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\") " pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.725990 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wdk\" (UniqueName: \"kubernetes.io/projected/1a75c36d-18fc-4133-81d5-a0313a42d9a7-kube-api-access-t2wdk\") pod \"must-gather-sqjdn\" (UID: \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\") " pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.726729 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a75c36d-18fc-4133-81d5-a0313a42d9a7-must-gather-output\") pod \"must-gather-sqjdn\" (UID: \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\") " pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.749811 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wdk\" (UniqueName: \"kubernetes.io/projected/1a75c36d-18fc-4133-81d5-a0313a42d9a7-kube-api-access-t2wdk\") pod \"must-gather-sqjdn\" (UID: \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\") " pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 10:57:09 crc kubenswrapper[4761]: I1201 10:57:09.837776 
4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 10:57:10 crc kubenswrapper[4761]: I1201 10:57:10.296695 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l5js8/must-gather-sqjdn"] Dec 01 10:57:10 crc kubenswrapper[4761]: I1201 10:57:10.831288 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l5js8/must-gather-sqjdn" event={"ID":"1a75c36d-18fc-4133-81d5-a0313a42d9a7","Type":"ContainerStarted","Data":"e01bb8076daf819df06a0652f2a29b203465e13a030495d85391dba0bf5fc04b"} Dec 01 10:57:14 crc kubenswrapper[4761]: I1201 10:57:14.856888 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l5js8/must-gather-sqjdn" event={"ID":"1a75c36d-18fc-4133-81d5-a0313a42d9a7","Type":"ContainerStarted","Data":"2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b"} Dec 01 10:57:15 crc kubenswrapper[4761]: I1201 10:57:15.864148 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l5js8/must-gather-sqjdn" event={"ID":"1a75c36d-18fc-4133-81d5-a0313a42d9a7","Type":"ContainerStarted","Data":"1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc"} Dec 01 10:57:15 crc kubenswrapper[4761]: I1201 10:57:15.880769 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l5js8/must-gather-sqjdn" podStartSLOduration=2.760657048 podStartE2EDuration="6.88075173s" podCreationTimestamp="2025-12-01 10:57:09 +0000 UTC" firstStartedPulling="2025-12-01 10:57:10.313927765 +0000 UTC m=+1569.617686389" lastFinishedPulling="2025-12-01 10:57:14.434022447 +0000 UTC m=+1573.737781071" observedRunningTime="2025-12-01 10:57:15.877599007 +0000 UTC m=+1575.181357631" watchObservedRunningTime="2025-12-01 10:57:15.88075173 +0000 UTC m=+1575.184510354" Dec 01 10:57:28 crc kubenswrapper[4761]: E1201 10:57:28.208726 4761 configmap.go:193] Couldn't get 
configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:57:28 crc kubenswrapper[4761]: E1201 10:57:28.209199 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:28.709181729 +0000 UTC m=+1588.012940353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:57:28 crc kubenswrapper[4761]: E1201 10:57:28.208783 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:57:28 crc kubenswrapper[4761]: E1201 10:57:28.209336 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:28.709312783 +0000 UTC m=+1588.013071417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:57:28 crc kubenswrapper[4761]: E1201 10:57:28.716860 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:57:28 crc kubenswrapper[4761]: E1201 10:57:28.716886 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:57:28 crc kubenswrapper[4761]: E1201 10:57:28.716951 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:29.716934325 +0000 UTC m=+1589.020692959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:57:28 crc kubenswrapper[4761]: E1201 10:57:28.716973 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:29.716963985 +0000 UTC m=+1589.020722619 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:57:29 crc kubenswrapper[4761]: E1201 10:57:29.729732 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:57:29 crc kubenswrapper[4761]: E1201 10:57:29.729732 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:57:29 crc kubenswrapper[4761]: E1201 10:57:29.729828 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:31.729805014 +0000 UTC m=+1591.033563648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:57:29 crc kubenswrapper[4761]: E1201 10:57:29.729879 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:31.729857145 +0000 UTC m=+1591.033615769 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:57:31 crc kubenswrapper[4761]: E1201 10:57:31.755648 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:57:31 crc kubenswrapper[4761]: E1201 10:57:31.755717 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:57:31 crc kubenswrapper[4761]: E1201 10:57:31.755751 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:35.75573209 +0000 UTC m=+1595.059490724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:57:31 crc kubenswrapper[4761]: E1201 10:57:31.755816 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:35.755788401 +0000 UTC m=+1595.059547055 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:57:35 crc kubenswrapper[4761]: E1201 10:57:35.808454 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:57:35 crc kubenswrapper[4761]: E1201 10:57:35.809028 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:43.809004907 +0000 UTC m=+1603.112763531 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:57:35 crc kubenswrapper[4761]: E1201 10:57:35.808475 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:57:35 crc kubenswrapper[4761]: E1201 10:57:35.809127 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:43.80911209 +0000 UTC m=+1603.112870714 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:57:43 crc kubenswrapper[4761]: E1201 10:57:43.816433 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:57:43 crc kubenswrapper[4761]: E1201 10:57:43.816970 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:59.816954116 +0000 UTC m=+1619.120712740 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:57:43 crc kubenswrapper[4761]: E1201 10:57:43.816435 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:57:43 crc kubenswrapper[4761]: E1201 10:57:43.817026 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:57:59.817013537 +0000 UTC m=+1619.120772161 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.358897 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/util/0.log" Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.532463 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/pull/0.log" Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.536338 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/util/0.log" Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.595117 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/pull/0.log" Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.759542 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/pull/0.log" Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.794078 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/util/0.log" Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.794458 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/extract/0.log" Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.913561 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68949bdcb7-pd467_c3915eec-1b53-4ec3-b44c-ead2e1fdfe03/manager/0.log" Dec 01 10:57:49 crc kubenswrapper[4761]: I1201 10:57:49.964886 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-v8vkt_f837fbca-013a-4054-af8d-fcf798f568d0/registry-server/0.log" Dec 01 10:57:59 crc kubenswrapper[4761]: E1201 10:57:59.833634 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:57:59 crc kubenswrapper[4761]: E1201 10:57:59.834834 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:58:31.834777368 +0000 UTC m=+1651.138536032 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:57:59 crc kubenswrapper[4761]: E1201 10:57:59.833728 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:57:59 crc kubenswrapper[4761]: E1201 10:57:59.834991 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:58:31.834954283 +0000 UTC m=+1651.138712917 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:58:02 crc kubenswrapper[4761]: I1201 10:58:02.959731 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-lwd6m_49f94e97-89ed-41ca-b0c1-620d9e69ae81/control-plane-machine-set-operator/0.log" Dec 01 10:58:03 crc kubenswrapper[4761]: I1201 10:58:03.095873 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xzg25_5085aee7-8987-489e-86af-3c11f1a6618d/kube-rbac-proxy/0.log" Dec 01 10:58:03 crc kubenswrapper[4761]: I1201 10:58:03.122898 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xzg25_5085aee7-8987-489e-86af-3c11f1a6618d/machine-api-operator/0.log" Dec 01 10:58:03 crc kubenswrapper[4761]: I1201 10:58:03.850144 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:58:03 crc kubenswrapper[4761]: I1201 10:58:03.850234 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:58:09 crc kubenswrapper[4761]: I1201 10:58:09.332407 4761 scope.go:117] 
"RemoveContainer" containerID="82014b7e289f2c16471309d7c778f0fea29c824bff1aeb3fdda39a5b7591f7c0" Dec 01 10:58:09 crc kubenswrapper[4761]: I1201 10:58:09.386543 4761 scope.go:117] "RemoveContainer" containerID="f94c946c761272a768bb936be1dec51e638919f72fc9071f22de317a2831e55f" Dec 01 10:58:09 crc kubenswrapper[4761]: I1201 10:58:09.403004 4761 scope.go:117] "RemoveContainer" containerID="dc5e8423924636c631e2c8e920ac2be6d56ab98e4efccc5f1cc3fa410967f98e" Dec 01 10:58:09 crc kubenswrapper[4761]: I1201 10:58:09.452705 4761 scope.go:117] "RemoveContainer" containerID="4fe4fb3bc22aa0fba9ed8bad95da774a5c2e52fc7fd5ef7098a06bc812285540" Dec 01 10:58:09 crc kubenswrapper[4761]: I1201 10:58:09.476040 4761 scope.go:117] "RemoveContainer" containerID="071b7b461dd3bd9ce1a872230779f43f6f52506d4126e6f67ffb65fccb4cd80c" Dec 01 10:58:09 crc kubenswrapper[4761]: I1201 10:58:09.494750 4761 scope.go:117] "RemoveContainer" containerID="2f0c69f6996136859db1de8b25d96f5e0c6b7504d95c33d4b7549f81caed8103" Dec 01 10:58:09 crc kubenswrapper[4761]: I1201 10:58:09.523183 4761 scope.go:117] "RemoveContainer" containerID="6813bcf8145b42d9a5fabc2f4130bb22ec0a387f8da58b71b59ad3e27e2fd514" Dec 01 10:58:09 crc kubenswrapper[4761]: I1201 10:58:09.577423 4761 scope.go:117] "RemoveContainer" containerID="39f373ef322b45ef2ed9f8ed5ff93459091bfbbf73ee766cce03d076759a4a27" Dec 01 10:58:19 crc kubenswrapper[4761]: I1201 10:58:19.564621 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-82n8s_93bcfa9d-c2bd-4d59-9be1-181d49ab1009/controller/0.log" Dec 01 10:58:19 crc kubenswrapper[4761]: I1201 10:58:19.572303 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-82n8s_93bcfa9d-c2bd-4d59-9be1-181d49ab1009/kube-rbac-proxy/0.log" Dec 01 10:58:19 crc kubenswrapper[4761]: I1201 10:58:19.728884 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-frr-files/0.log" Dec 01 10:58:19 crc kubenswrapper[4761]: I1201 10:58:19.883327 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-frr-files/0.log" Dec 01 10:58:19 crc kubenswrapper[4761]: I1201 10:58:19.930174 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-metrics/0.log" Dec 01 10:58:19 crc kubenswrapper[4761]: I1201 10:58:19.932210 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-reloader/0.log" Dec 01 10:58:19 crc kubenswrapper[4761]: I1201 10:58:19.936490 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-reloader/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.083745 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-frr-files/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.109877 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-reloader/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.131779 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-metrics/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.165849 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-metrics/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.280109 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-frr-files/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.286879 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-reloader/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.333673 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/controller/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.350861 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-metrics/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.470360 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/frr-metrics/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.504907 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/kube-rbac-proxy-frr/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.507587 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/kube-rbac-proxy/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.733457 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-t6f4w_d5fccf55-8452-4691-9d4b-d27b6c9e0a2f/frr-k8s-webhook-server/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.768213 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/reloader/0.log" Dec 01 10:58:20 crc kubenswrapper[4761]: I1201 10:58:20.911496 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-66985c5f8b-b6zh4_e1506c16-7214-4b74-a6d5-935646d2bb83/manager/0.log" Dec 01 10:58:21 crc kubenswrapper[4761]: I1201 10:58:21.076836 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56bbcd747-q8n7d_cc5f6c3c-71a1-443c-9c3a-67fc2305dd62/webhook-server/0.log" Dec 01 10:58:21 crc kubenswrapper[4761]: I1201 10:58:21.095519 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/frr/0.log" Dec 01 10:58:21 crc kubenswrapper[4761]: I1201 10:58:21.166218 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hqff8_d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c/kube-rbac-proxy/0.log" Dec 01 10:58:21 crc kubenswrapper[4761]: I1201 10:58:21.391482 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hqff8_d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c/speaker/0.log" Dec 01 10:58:31 crc kubenswrapper[4761]: E1201 10:58:31.841083 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:58:31 crc kubenswrapper[4761]: E1201 10:58:31.841735 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:59:35.841715439 +0000 UTC m=+1715.145474063 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:58:31 crc kubenswrapper[4761]: E1201 10:58:31.841101 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:58:31 crc kubenswrapper[4761]: E1201 10:58:31.841835 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 10:59:35.841803301 +0000 UTC m=+1715.145561965 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:58:33 crc kubenswrapper[4761]: I1201 10:58:33.849789 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:58:33 crc kubenswrapper[4761]: I1201 10:58:33.850305 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:58:35 crc kubenswrapper[4761]: I1201 10:58:35.618540 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/glance-kuttl-tests_openstackclient_3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3/openstackclient/0.log" Dec 01 10:58:48 crc kubenswrapper[4761]: I1201 10:58:48.672587 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/util/0.log" Dec 01 10:58:48 crc kubenswrapper[4761]: I1201 10:58:48.846572 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/pull/0.log" Dec 01 10:58:48 crc kubenswrapper[4761]: I1201 10:58:48.858928 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/util/0.log" Dec 01 10:58:48 crc kubenswrapper[4761]: I1201 10:58:48.862227 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/pull/0.log" Dec 01 10:58:48 crc kubenswrapper[4761]: I1201 10:58:48.987517 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/extract/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.006892 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/pull/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.008492 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/util/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 
10:58:49.144026 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-utilities/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.325008 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-utilities/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.352465 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-content/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.380408 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-content/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.534450 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-content/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.539803 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-utilities/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.717965 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-utilities/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.889979 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/registry-server/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.898516 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-content/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.901340 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-utilities/0.log" Dec 01 10:58:49 crc kubenswrapper[4761]: I1201 10:58:49.964076 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-content/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.117965 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-content/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.148813 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-utilities/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.334192 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lgbgw_04df1b9e-01cf-41e0-af31-dcb2e0512d45/marketplace-operator/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.408287 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-utilities/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.533022 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/registry-server/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.553407 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-utilities/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.556096 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-content/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.614180 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-content/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.757082 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-content/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.844800 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-utilities/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.845063 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/registry-server/0.log" Dec 01 10:58:50 crc kubenswrapper[4761]: I1201 10:58:50.962731 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-utilities/0.log" Dec 01 10:58:51 crc kubenswrapper[4761]: I1201 10:58:51.146684 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-content/0.log" Dec 01 10:58:51 crc kubenswrapper[4761]: I1201 10:58:51.151405 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-content/0.log" 
Dec 01 10:58:51 crc kubenswrapper[4761]: I1201 10:58:51.171110 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-utilities/0.log" Dec 01 10:58:51 crc kubenswrapper[4761]: I1201 10:58:51.451713 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-content/0.log" Dec 01 10:58:51 crc kubenswrapper[4761]: I1201 10:58:51.481146 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-utilities/0.log" Dec 01 10:58:51 crc kubenswrapper[4761]: I1201 10:58:51.581763 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/registry-server/0.log" Dec 01 10:59:03 crc kubenswrapper[4761]: I1201 10:59:03.850469 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:59:03 crc kubenswrapper[4761]: I1201 10:59:03.850883 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:59:03 crc kubenswrapper[4761]: I1201 10:59:03.850922 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 10:59:03 crc kubenswrapper[4761]: I1201 10:59:03.851421 4761 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:59:03 crc kubenswrapper[4761]: I1201 10:59:03.851465 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" gracePeriod=600 Dec 01 10:59:03 crc kubenswrapper[4761]: E1201 10:59:03.975958 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 10:59:04 crc kubenswrapper[4761]: I1201 10:59:04.646640 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" exitCode=0 Dec 01 10:59:04 crc kubenswrapper[4761]: I1201 10:59:04.646716 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8"} Dec 01 10:59:04 crc kubenswrapper[4761]: I1201 10:59:04.647020 4761 scope.go:117] "RemoveContainer" containerID="7d57787b78893daee12ca3c7dcee8cb3520b06bd08aeb0d4d1cb8f9e5545ff08" Dec 01 10:59:04 crc 
kubenswrapper[4761]: I1201 10:59:04.647965 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 10:59:04 crc kubenswrapper[4761]: E1201 10:59:04.648169 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 10:59:09 crc kubenswrapper[4761]: I1201 10:59:09.692744 4761 scope.go:117] "RemoveContainer" containerID="aebc17b337b0f856ab6f9c36fc6a95761ed0a40b7c965f962a03e914b4350778" Dec 01 10:59:09 crc kubenswrapper[4761]: I1201 10:59:09.768506 4761 scope.go:117] "RemoveContainer" containerID="dc4fb35971e2f60efc7a8d59fd1284528556b794ba4733b76cb1c50ed7890466" Dec 01 10:59:09 crc kubenswrapper[4761]: I1201 10:59:09.789482 4761 scope.go:117] "RemoveContainer" containerID="7964fbd6798c56ca805db0689a7899be84af2c935218a9cc77e4848aef4e500a" Dec 01 10:59:09 crc kubenswrapper[4761]: I1201 10:59:09.832620 4761 scope.go:117] "RemoveContainer" containerID="24d0f43937ac60404dda8bc82a721a04db080713375c4f92443dac9ba5ebe113" Dec 01 10:59:18 crc kubenswrapper[4761]: I1201 10:59:18.129096 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 10:59:18 crc kubenswrapper[4761]: E1201 10:59:18.130135 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" 
podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 10:59:33 crc kubenswrapper[4761]: I1201 10:59:33.128411 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 10:59:33 crc kubenswrapper[4761]: E1201 10:59:33.130579 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 10:59:35 crc kubenswrapper[4761]: E1201 10:59:35.929118 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 10:59:35 crc kubenswrapper[4761]: E1201 10:59:35.929230 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:01:37.929203435 +0000 UTC m=+1837.232962089 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 10:59:35 crc kubenswrapper[4761]: E1201 10:59:35.929240 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 10:59:35 crc kubenswrapper[4761]: E1201 10:59:35.929342 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:01:37.929314128 +0000 UTC m=+1837.233072792 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 10:59:47 crc kubenswrapper[4761]: I1201 10:59:47.128793 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 10:59:47 crc kubenswrapper[4761]: E1201 10:59:47.129670 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 10:59:56 crc kubenswrapper[4761]: I1201 10:59:56.055870 4761 generic.go:334] "Generic (PLEG): container finished" podID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" 
containerID="2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b" exitCode=0 Dec 01 10:59:56 crc kubenswrapper[4761]: I1201 10:59:56.056001 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l5js8/must-gather-sqjdn" event={"ID":"1a75c36d-18fc-4133-81d5-a0313a42d9a7","Type":"ContainerDied","Data":"2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b"} Dec 01 10:59:56 crc kubenswrapper[4761]: I1201 10:59:56.058802 4761 scope.go:117] "RemoveContainer" containerID="2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b" Dec 01 10:59:56 crc kubenswrapper[4761]: I1201 10:59:56.610423 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l5js8_must-gather-sqjdn_1a75c36d-18fc-4133-81d5-a0313a42d9a7/gather/0.log" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.150153 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb"] Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.151603 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.155356 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.155589 4761 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.158733 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb"] Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.198648 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-secret-volume\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.199187 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d64np\" (UniqueName: \"kubernetes.io/projected/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-kube-api-access-d64np\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.199229 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-config-volume\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.300414 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d64np\" (UniqueName: \"kubernetes.io/projected/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-kube-api-access-d64np\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.300464 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-config-volume\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.300529 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-secret-volume\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.302632 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-config-volume\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.317416 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-secret-volume\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.322222 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d64np\" (UniqueName: \"kubernetes.io/projected/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-kube-api-access-d64np\") pod \"collect-profiles-29409780-wwdgb\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.531171 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:00 crc kubenswrapper[4761]: I1201 11:00:00.962077 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb"] Dec 01 11:00:01 crc kubenswrapper[4761]: I1201 11:00:01.090578 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" event={"ID":"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266","Type":"ContainerStarted","Data":"88e6452aa1093f231142d5ca087af7dfff8b8f5773a2261669b2cad4d4ea0d02"} Dec 01 11:00:01 crc kubenswrapper[4761]: I1201 11:00:01.150938 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:00:01 crc kubenswrapper[4761]: E1201 11:00:01.151289 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:00:02 crc kubenswrapper[4761]: I1201 11:00:02.105011 4761 generic.go:334] "Generic (PLEG): container finished" podID="8fa1e30b-f02f-4bfe-acdd-4b97c42d0266" containerID="c93641595f2440b8f22854864a618988033fc237ab731668b80e95621a548973" exitCode=0 Dec 01 11:00:02 crc kubenswrapper[4761]: I1201 11:00:02.105081 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" event={"ID":"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266","Type":"ContainerDied","Data":"c93641595f2440b8f22854864a618988033fc237ab731668b80e95621a548973"} Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.388491 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l5js8/must-gather-sqjdn"] Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.389483 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-l5js8/must-gather-sqjdn" podUID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" containerName="copy" containerID="cri-o://1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc" gracePeriod=2 Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.392934 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l5js8/must-gather-sqjdn"] Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.548629 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.647453 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-secret-volume\") pod \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.647571 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d64np\" (UniqueName: \"kubernetes.io/projected/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-kube-api-access-d64np\") pod \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.647614 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-config-volume\") pod \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\" (UID: \"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266\") " Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.648398 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-config-volume" (OuterVolumeSpecName: "config-volume") pod "8fa1e30b-f02f-4bfe-acdd-4b97c42d0266" (UID: "8fa1e30b-f02f-4bfe-acdd-4b97c42d0266"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.654610 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-kube-api-access-d64np" (OuterVolumeSpecName: "kube-api-access-d64np") pod "8fa1e30b-f02f-4bfe-acdd-4b97c42d0266" (UID: "8fa1e30b-f02f-4bfe-acdd-4b97c42d0266"). 
InnerVolumeSpecName "kube-api-access-d64np". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.655205 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8fa1e30b-f02f-4bfe-acdd-4b97c42d0266" (UID: "8fa1e30b-f02f-4bfe-acdd-4b97c42d0266"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.727060 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l5js8_must-gather-sqjdn_1a75c36d-18fc-4133-81d5-a0313a42d9a7/copy/0.log" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.727581 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.749432 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wdk\" (UniqueName: \"kubernetes.io/projected/1a75c36d-18fc-4133-81d5-a0313a42d9a7-kube-api-access-t2wdk\") pod \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\" (UID: \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\") " Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.749523 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a75c36d-18fc-4133-81d5-a0313a42d9a7-must-gather-output\") pod \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\" (UID: \"1a75c36d-18fc-4133-81d5-a0313a42d9a7\") " Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.749789 4761 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 
11:00:03.749805 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d64np\" (UniqueName: \"kubernetes.io/projected/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-kube-api-access-d64np\") on node \"crc\" DevicePath \"\"" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.749814 4761 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fa1e30b-f02f-4bfe-acdd-4b97c42d0266-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.755386 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a75c36d-18fc-4133-81d5-a0313a42d9a7-kube-api-access-t2wdk" (OuterVolumeSpecName: "kube-api-access-t2wdk") pod "1a75c36d-18fc-4133-81d5-a0313a42d9a7" (UID: "1a75c36d-18fc-4133-81d5-a0313a42d9a7"). InnerVolumeSpecName "kube-api-access-t2wdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.824465 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a75c36d-18fc-4133-81d5-a0313a42d9a7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1a75c36d-18fc-4133-81d5-a0313a42d9a7" (UID: "1a75c36d-18fc-4133-81d5-a0313a42d9a7"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.850539 4761 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a75c36d-18fc-4133-81d5-a0313a42d9a7-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 11:00:03 crc kubenswrapper[4761]: I1201 11:00:03.850592 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wdk\" (UniqueName: \"kubernetes.io/projected/1a75c36d-18fc-4133-81d5-a0313a42d9a7-kube-api-access-t2wdk\") on node \"crc\" DevicePath \"\"" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.120358 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" event={"ID":"8fa1e30b-f02f-4bfe-acdd-4b97c42d0266","Type":"ContainerDied","Data":"88e6452aa1093f231142d5ca087af7dfff8b8f5773a2261669b2cad4d4ea0d02"} Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.120382 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-wwdgb" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.120401 4761 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88e6452aa1093f231142d5ca087af7dfff8b8f5773a2261669b2cad4d4ea0d02" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.121665 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l5js8_must-gather-sqjdn_1a75c36d-18fc-4133-81d5-a0313a42d9a7/copy/0.log" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.121942 4761 generic.go:334] "Generic (PLEG): container finished" podID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" containerID="1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc" exitCode=143 Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.121977 4761 scope.go:117] "RemoveContainer" containerID="1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.122078 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l5js8/must-gather-sqjdn" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.149755 4761 scope.go:117] "RemoveContainer" containerID="2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.188737 4761 scope.go:117] "RemoveContainer" containerID="1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc" Dec 01 11:00:04 crc kubenswrapper[4761]: E1201 11:00:04.189222 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc\": container with ID starting with 1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc not found: ID does not exist" containerID="1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.189271 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc"} err="failed to get container status \"1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc\": rpc error: code = NotFound desc = could not find container \"1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc\": container with ID starting with 1f72aa8f23c2d6864ce374f6f9c83adc3c3d0bd19850fb7c2a0f4c6615a146fc not found: ID does not exist" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.189297 4761 scope.go:117] "RemoveContainer" containerID="2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b" Dec 01 11:00:04 crc kubenswrapper[4761]: E1201 11:00:04.189615 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b\": container with ID starting with 
2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b not found: ID does not exist" containerID="2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b" Dec 01 11:00:04 crc kubenswrapper[4761]: I1201 11:00:04.189669 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b"} err="failed to get container status \"2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b\": rpc error: code = NotFound desc = could not find container \"2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b\": container with ID starting with 2edbc4d26fc1f5bcf6ce446e1e914995ac261277e1862dbdd06cf354b46c868b not found: ID does not exist" Dec 01 11:00:05 crc kubenswrapper[4761]: I1201 11:00:05.135280 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" path="/var/lib/kubelet/pods/1a75c36d-18fc-4133-81d5-a0313a42d9a7/volumes" Dec 01 11:00:09 crc kubenswrapper[4761]: I1201 11:00:09.916525 4761 scope.go:117] "RemoveContainer" containerID="fec549f18510f9be0081d189d5b22bd5048deb784449d03335aa7f9f5e29a4e1" Dec 01 11:00:09 crc kubenswrapper[4761]: I1201 11:00:09.990861 4761 scope.go:117] "RemoveContainer" containerID="6dd719fd2c08d901e32219e95784babf0c628ee0d965c75e8c54e9d06b18e929" Dec 01 11:00:10 crc kubenswrapper[4761]: I1201 11:00:10.016176 4761 scope.go:117] "RemoveContainer" containerID="9270e8169fdb53f4a7cfa62656a35f20480492955354f4cff741caa91330201b" Dec 01 11:00:10 crc kubenswrapper[4761]: I1201 11:00:10.053106 4761 scope.go:117] "RemoveContainer" containerID="ff82762c704ee0d6672b7257584125f1f97359cb54bd7db233dd8b4d58778082" Dec 01 11:00:13 crc kubenswrapper[4761]: I1201 11:00:13.128336 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:00:13 crc kubenswrapper[4761]: E1201 11:00:13.129090 4761 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:00:24 crc kubenswrapper[4761]: I1201 11:00:24.128685 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:00:24 crc kubenswrapper[4761]: E1201 11:00:24.129459 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:00:35 crc kubenswrapper[4761]: I1201 11:00:35.129661 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:00:35 crc kubenswrapper[4761]: E1201 11:00:35.131734 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:00:47 crc kubenswrapper[4761]: I1201 11:00:47.128640 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:00:47 crc kubenswrapper[4761]: E1201 
11:00:47.129403 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:00:59 crc kubenswrapper[4761]: I1201 11:00:59.129332 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:00:59 crc kubenswrapper[4761]: E1201 11:00:59.130304 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:01:10 crc kubenswrapper[4761]: I1201 11:01:10.136158 4761 scope.go:117] "RemoveContainer" containerID="4bf439d53e72d7d5552ffba6b5ff6f5739d05ee4e3cc58da728fd6f38dc59a69" Dec 01 11:01:10 crc kubenswrapper[4761]: I1201 11:01:10.211236 4761 scope.go:117] "RemoveContainer" containerID="85ae3cc8dd4564eb71c547ba412e90eebc1add08b5e29ff93fabb2de80f6d76a" Dec 01 11:01:10 crc kubenswrapper[4761]: I1201 11:01:10.235488 4761 scope.go:117] "RemoveContainer" containerID="c07b9f8aa85340d67c1da884aeabeee7fe06be2221e8ccf267e75e75eb3832f9" Dec 01 11:01:10 crc kubenswrapper[4761]: I1201 11:01:10.280138 4761 scope.go:117] "RemoveContainer" containerID="c92a66e140e324f99840149aef24d020cd60d5a14504733d37f2e0c3928548a9" Dec 01 11:01:12 crc kubenswrapper[4761]: I1201 11:01:12.129093 4761 scope.go:117] "RemoveContainer" 
containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:01:12 crc kubenswrapper[4761]: E1201 11:01:12.129828 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:01:25 crc kubenswrapper[4761]: I1201 11:01:25.128654 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:01:25 crc kubenswrapper[4761]: E1201 11:01:25.129844 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:01:37 crc kubenswrapper[4761]: E1201 11:01:37.959636 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 11:01:37 crc kubenswrapper[4761]: E1201 11:01:37.959735 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 11:01:37 crc kubenswrapper[4761]: E1201 11:01:37.960445 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. 
No retries permitted until 2025-12-01 11:03:39.960415464 +0000 UTC m=+1959.264174098 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 11:01:37 crc kubenswrapper[4761]: E1201 11:01:37.960537 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:03:39.960501146 +0000 UTC m=+1959.264259800 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 11:01:39 crc kubenswrapper[4761]: I1201 11:01:39.129486 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:01:39 crc kubenswrapper[4761]: E1201 11:01:39.129945 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:01:54 crc kubenswrapper[4761]: I1201 11:01:54.128678 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:01:54 crc kubenswrapper[4761]: E1201 11:01:54.129498 4761 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:02:08 crc kubenswrapper[4761]: I1201 11:02:08.128648 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:02:08 crc kubenswrapper[4761]: E1201 11:02:08.129441 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:02:20 crc kubenswrapper[4761]: I1201 11:02:20.128650 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:02:20 crc kubenswrapper[4761]: E1201 11:02:20.129356 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.975110 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9mbns/must-gather-mxjh8"] Dec 01 11:02:28 crc kubenswrapper[4761]: E1201 11:02:28.976137 4761 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" containerName="copy" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.976161 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" containerName="copy" Dec 01 11:02:28 crc kubenswrapper[4761]: E1201 11:02:28.976176 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa1e30b-f02f-4bfe-acdd-4b97c42d0266" containerName="collect-profiles" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.976188 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa1e30b-f02f-4bfe-acdd-4b97c42d0266" containerName="collect-profiles" Dec 01 11:02:28 crc kubenswrapper[4761]: E1201 11:02:28.976214 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" containerName="gather" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.976226 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" containerName="gather" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.976421 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" containerName="gather" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.976440 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa1e30b-f02f-4bfe-acdd-4b97c42d0266" containerName="collect-profiles" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.976473 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a75c36d-18fc-4133-81d5-a0313a42d9a7" containerName="copy" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.977456 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.979834 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9mbns"/"openshift-service-ca.crt" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.994398 4761 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9mbns"/"kube-root-ca.crt" Dec 01 11:02:28 crc kubenswrapper[4761]: I1201 11:02:28.999592 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9mbns/must-gather-mxjh8"] Dec 01 11:02:29 crc kubenswrapper[4761]: I1201 11:02:29.031182 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh7c2\" (UniqueName: \"kubernetes.io/projected/639cd093-0007-45df-b8f9-e2c36cb54554-kube-api-access-rh7c2\") pod \"must-gather-mxjh8\" (UID: \"639cd093-0007-45df-b8f9-e2c36cb54554\") " pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:02:29 crc kubenswrapper[4761]: I1201 11:02:29.031327 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/639cd093-0007-45df-b8f9-e2c36cb54554-must-gather-output\") pod \"must-gather-mxjh8\" (UID: \"639cd093-0007-45df-b8f9-e2c36cb54554\") " pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:02:29 crc kubenswrapper[4761]: I1201 11:02:29.131956 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/639cd093-0007-45df-b8f9-e2c36cb54554-must-gather-output\") pod \"must-gather-mxjh8\" (UID: \"639cd093-0007-45df-b8f9-e2c36cb54554\") " pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:02:29 crc kubenswrapper[4761]: I1201 11:02:29.132017 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rh7c2\" (UniqueName: \"kubernetes.io/projected/639cd093-0007-45df-b8f9-e2c36cb54554-kube-api-access-rh7c2\") pod \"must-gather-mxjh8\" (UID: \"639cd093-0007-45df-b8f9-e2c36cb54554\") " pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:02:29 crc kubenswrapper[4761]: I1201 11:02:29.132815 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/639cd093-0007-45df-b8f9-e2c36cb54554-must-gather-output\") pod \"must-gather-mxjh8\" (UID: \"639cd093-0007-45df-b8f9-e2c36cb54554\") " pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:02:29 crc kubenswrapper[4761]: I1201 11:02:29.149715 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh7c2\" (UniqueName: \"kubernetes.io/projected/639cd093-0007-45df-b8f9-e2c36cb54554-kube-api-access-rh7c2\") pod \"must-gather-mxjh8\" (UID: \"639cd093-0007-45df-b8f9-e2c36cb54554\") " pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:02:29 crc kubenswrapper[4761]: I1201 11:02:29.303119 4761 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:02:30 crc kubenswrapper[4761]: I1201 11:02:29.505639 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9mbns/must-gather-mxjh8"] Dec 01 11:02:30 crc kubenswrapper[4761]: I1201 11:02:30.270095 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mbns/must-gather-mxjh8" event={"ID":"639cd093-0007-45df-b8f9-e2c36cb54554","Type":"ContainerStarted","Data":"978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a"} Dec 01 11:02:30 crc kubenswrapper[4761]: I1201 11:02:30.270392 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mbns/must-gather-mxjh8" event={"ID":"639cd093-0007-45df-b8f9-e2c36cb54554","Type":"ContainerStarted","Data":"9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03"} Dec 01 11:02:30 crc kubenswrapper[4761]: I1201 11:02:30.270405 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mbns/must-gather-mxjh8" event={"ID":"639cd093-0007-45df-b8f9-e2c36cb54554","Type":"ContainerStarted","Data":"14ccb31de605375cb5206e72ef298765dad6a569cbf29cfb3c3255982a60bb23"} Dec 01 11:02:30 crc kubenswrapper[4761]: I1201 11:02:30.290608 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9mbns/must-gather-mxjh8" podStartSLOduration=2.290590804 podStartE2EDuration="2.290590804s" podCreationTimestamp="2025-12-01 11:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:02:30.290250425 +0000 UTC m=+1889.594009049" watchObservedRunningTime="2025-12-01 11:02:30.290590804 +0000 UTC m=+1889.594349438" Dec 01 11:02:33 crc kubenswrapper[4761]: I1201 11:02:33.128589 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:02:33 crc 
kubenswrapper[4761]: E1201 11:02:33.129384 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:02:46 crc kubenswrapper[4761]: I1201 11:02:46.129050 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:02:46 crc kubenswrapper[4761]: E1201 11:02:46.130039 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:02:57 crc kubenswrapper[4761]: I1201 11:02:57.128697 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:02:57 crc kubenswrapper[4761]: E1201 11:02:57.130832 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:03:04 crc kubenswrapper[4761]: I1201 11:03:04.835501 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/util/0.log" Dec 01 11:03:05 crc kubenswrapper[4761]: I1201 11:03:05.015692 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/util/0.log" Dec 01 11:03:05 crc kubenswrapper[4761]: I1201 11:03:05.028392 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/pull/0.log" Dec 01 11:03:05 crc kubenswrapper[4761]: I1201 11:03:05.056303 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/pull/0.log" Dec 01 11:03:05 crc kubenswrapper[4761]: I1201 11:03:05.202894 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/extract/0.log" Dec 01 11:03:05 crc kubenswrapper[4761]: I1201 11:03:05.203220 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/pull/0.log" Dec 01 11:03:05 crc kubenswrapper[4761]: I1201 11:03:05.215310 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368ctc79x_b97b53a1-4f2c-457c-8d54-af349c67f688/util/0.log" Dec 01 11:03:05 crc kubenswrapper[4761]: I1201 11:03:05.356691 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68949bdcb7-pd467_c3915eec-1b53-4ec3-b44c-ead2e1fdfe03/manager/0.log" Dec 01 11:03:05 crc 
kubenswrapper[4761]: I1201 11:03:05.415346 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-v8vkt_f837fbca-013a-4054-af8d-fcf798f568d0/registry-server/0.log" Dec 01 11:03:10 crc kubenswrapper[4761]: I1201 11:03:10.392812 4761 scope.go:117] "RemoveContainer" containerID="8b7e9db98de6dcba21deb66f0a0dd48732e37f6cf224f0560f1a3c7f1a40c302" Dec 01 11:03:11 crc kubenswrapper[4761]: I1201 11:03:11.131589 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:03:11 crc kubenswrapper[4761]: E1201 11:03:11.132165 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" Dec 01 11:03:19 crc kubenswrapper[4761]: I1201 11:03:19.814975 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-lwd6m_49f94e97-89ed-41ca-b0c1-620d9e69ae81/control-plane-machine-set-operator/0.log" Dec 01 11:03:19 crc kubenswrapper[4761]: I1201 11:03:19.970112 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xzg25_5085aee7-8987-489e-86af-3c11f1a6618d/kube-rbac-proxy/0.log" Dec 01 11:03:19 crc kubenswrapper[4761]: I1201 11:03:19.978141 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xzg25_5085aee7-8987-489e-86af-3c11f1a6618d/machine-api-operator/0.log" Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.454219 4761 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-rz8xc"]
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.455424 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.460308 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rz8xc"]
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.629021 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p2nr\" (UniqueName: \"kubernetes.io/projected/d2c4705b-568d-4623-aa35-84b44bb24939-kube-api-access-9p2nr\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.629356 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-catalog-content\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.630188 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-utilities\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.732465 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-utilities\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.732690 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p2nr\" (UniqueName: \"kubernetes.io/projected/d2c4705b-568d-4623-aa35-84b44bb24939-kube-api-access-9p2nr\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.732730 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-catalog-content\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.732979 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-utilities\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.733338 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-catalog-content\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.752271 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p2nr\" (UniqueName: \"kubernetes.io/projected/d2c4705b-568d-4623-aa35-84b44bb24939-kube-api-access-9p2nr\") pod \"certified-operators-rz8xc\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") " pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:21 crc kubenswrapper[4761]: I1201 11:03:21.810313 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:22 crc kubenswrapper[4761]: I1201 11:03:22.069930 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rz8xc"]
Dec 01 11:03:22 crc kubenswrapper[4761]: I1201 11:03:22.596576 4761 generic.go:334] "Generic (PLEG): container finished" podID="d2c4705b-568d-4623-aa35-84b44bb24939" containerID="b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69" exitCode=0
Dec 01 11:03:22 crc kubenswrapper[4761]: I1201 11:03:22.596665 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz8xc" event={"ID":"d2c4705b-568d-4623-aa35-84b44bb24939","Type":"ContainerDied","Data":"b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69"}
Dec 01 11:03:22 crc kubenswrapper[4761]: I1201 11:03:22.596870 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz8xc" event={"ID":"d2c4705b-568d-4623-aa35-84b44bb24939","Type":"ContainerStarted","Data":"5efeb4b315a4a91fdd4228ba69c945ba8d53df1060b838abd8f53ec549634878"}
Dec 01 11:03:22 crc kubenswrapper[4761]: I1201 11:03:22.598083 4761 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 11:03:23 crc kubenswrapper[4761]: I1201 11:03:23.603052 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz8xc" event={"ID":"d2c4705b-568d-4623-aa35-84b44bb24939","Type":"ContainerStarted","Data":"1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813"}
Dec 01 11:03:24 crc kubenswrapper[4761]: I1201 11:03:24.610118 4761 generic.go:334] "Generic (PLEG): container finished" podID="d2c4705b-568d-4623-aa35-84b44bb24939" containerID="1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813" exitCode=0
Dec 01 11:03:24 crc kubenswrapper[4761]: I1201 11:03:24.610167 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz8xc" event={"ID":"d2c4705b-568d-4623-aa35-84b44bb24939","Type":"ContainerDied","Data":"1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813"}
Dec 01 11:03:25 crc kubenswrapper[4761]: I1201 11:03:25.128275 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8"
Dec 01 11:03:25 crc kubenswrapper[4761]: E1201 11:03:25.128511 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c"
Dec 01 11:03:25 crc kubenswrapper[4761]: I1201 11:03:25.617413 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz8xc" event={"ID":"d2c4705b-568d-4623-aa35-84b44bb24939","Type":"ContainerStarted","Data":"2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48"}
Dec 01 11:03:25 crc kubenswrapper[4761]: I1201 11:03:25.642608 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rz8xc" podStartSLOduration=2.115525707 podStartE2EDuration="4.64258814s" podCreationTimestamp="2025-12-01 11:03:21 +0000 UTC" firstStartedPulling="2025-12-01 11:03:22.597865403 +0000 UTC m=+1941.901624027" lastFinishedPulling="2025-12-01 11:03:25.124927836 +0000 UTC m=+1944.428686460" observedRunningTime="2025-12-01 11:03:25.636933439 +0000 UTC m=+1944.940692073" watchObservedRunningTime="2025-12-01 11:03:25.64258814 +0000 UTC m=+1944.946346774"
Dec 01 11:03:31 crc kubenswrapper[4761]: I1201 11:03:31.811078 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:31 crc kubenswrapper[4761]: I1201 11:03:31.812785 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:31 crc kubenswrapper[4761]: I1201 11:03:31.879273 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:32 crc kubenswrapper[4761]: I1201 11:03:32.745901 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:32 crc kubenswrapper[4761]: I1201 11:03:32.801572 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rz8xc"]
Dec 01 11:03:34 crc kubenswrapper[4761]: I1201 11:03:34.685698 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rz8xc" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" containerName="registry-server" containerID="cri-o://2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48" gracePeriod=2
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.692341 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.695365 4761 generic.go:334] "Generic (PLEG): container finished" podID="d2c4705b-568d-4623-aa35-84b44bb24939" containerID="2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48" exitCode=0
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.695409 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz8xc" event={"ID":"d2c4705b-568d-4623-aa35-84b44bb24939","Type":"ContainerDied","Data":"2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48"}
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.695441 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz8xc" event={"ID":"d2c4705b-568d-4623-aa35-84b44bb24939","Type":"ContainerDied","Data":"5efeb4b315a4a91fdd4228ba69c945ba8d53df1060b838abd8f53ec549634878"}
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.695466 4761 scope.go:117] "RemoveContainer" containerID="2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.730114 4761 scope.go:117] "RemoveContainer" containerID="1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.757773 4761 scope.go:117] "RemoveContainer" containerID="b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.785702 4761 scope.go:117] "RemoveContainer" containerID="2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48"
Dec 01 11:03:35 crc kubenswrapper[4761]: E1201 11:03:35.786260 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48\": container with ID starting with 2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48 not found: ID does not exist" containerID="2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.786313 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48"} err="failed to get container status \"2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48\": rpc error: code = NotFound desc = could not find container \"2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48\": container with ID starting with 2609f1824c8f6398892003cb4d443de8fbc8dfeeded5ea47c437a989cfea4d48 not found: ID does not exist"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.786342 4761 scope.go:117] "RemoveContainer" containerID="1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813"
Dec 01 11:03:35 crc kubenswrapper[4761]: E1201 11:03:35.786744 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813\": container with ID starting with 1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813 not found: ID does not exist" containerID="1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.786774 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813"} err="failed to get container status \"1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813\": rpc error: code = NotFound desc = could not find container \"1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813\": container with ID starting with 1cf329de648c585291fbd859ee1b909f41dde1690a8ca0d0a2b6bc33d98f4813 not found: ID does not exist"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.786796 4761 scope.go:117] "RemoveContainer" containerID="b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69"
Dec 01 11:03:35 crc kubenswrapper[4761]: E1201 11:03:35.787110 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69\": container with ID starting with b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69 not found: ID does not exist" containerID="b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.787134 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69"} err="failed to get container status \"b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69\": rpc error: code = NotFound desc = could not find container \"b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69\": container with ID starting with b80ac1e39d361597f98bb8132073b9a16bf5f72d36cf76fb9aeb84f5f89b9b69 not found: ID does not exist"
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.850843 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p2nr\" (UniqueName: \"kubernetes.io/projected/d2c4705b-568d-4623-aa35-84b44bb24939-kube-api-access-9p2nr\") pod \"d2c4705b-568d-4623-aa35-84b44bb24939\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") "
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.850904 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-utilities\") pod \"d2c4705b-568d-4623-aa35-84b44bb24939\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") "
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.850958 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-catalog-content\") pod \"d2c4705b-568d-4623-aa35-84b44bb24939\" (UID: \"d2c4705b-568d-4623-aa35-84b44bb24939\") "
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.851977 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-utilities" (OuterVolumeSpecName: "utilities") pod "d2c4705b-568d-4623-aa35-84b44bb24939" (UID: "d2c4705b-568d-4623-aa35-84b44bb24939"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.858026 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c4705b-568d-4623-aa35-84b44bb24939-kube-api-access-9p2nr" (OuterVolumeSpecName: "kube-api-access-9p2nr") pod "d2c4705b-568d-4623-aa35-84b44bb24939" (UID: "d2c4705b-568d-4623-aa35-84b44bb24939"). InnerVolumeSpecName "kube-api-access-9p2nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.913928 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2c4705b-568d-4623-aa35-84b44bb24939" (UID: "d2c4705b-568d-4623-aa35-84b44bb24939"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.952617 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p2nr\" (UniqueName: \"kubernetes.io/projected/d2c4705b-568d-4623-aa35-84b44bb24939-kube-api-access-9p2nr\") on node \"crc\" DevicePath \"\""
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.952666 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:03:35 crc kubenswrapper[4761]: I1201 11:03:35.952686 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c4705b-568d-4623-aa35-84b44bb24939-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:03:36 crc kubenswrapper[4761]: I1201 11:03:36.703406 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rz8xc"
Dec 01 11:03:36 crc kubenswrapper[4761]: I1201 11:03:36.748488 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rz8xc"]
Dec 01 11:03:36 crc kubenswrapper[4761]: I1201 11:03:36.754412 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rz8xc"]
Dec 01 11:03:37 crc kubenswrapper[4761]: I1201 11:03:37.135845 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" path="/var/lib/kubelet/pods/d2c4705b-568d-4623-aa35-84b44bb24939/volumes"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.261234 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-82n8s_93bcfa9d-c2bd-4d59-9be1-181d49ab1009/kube-rbac-proxy/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.307289 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-82n8s_93bcfa9d-c2bd-4d59-9be1-181d49ab1009/controller/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.432366 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-frr-files/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.620603 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-reloader/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.627407 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-metrics/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.652873 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-frr-files/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.698490 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-reloader/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.804638 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-frr-files/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.844437 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-reloader/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.849609 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-metrics/0.log"
Dec 01 11:03:38 crc kubenswrapper[4761]: I1201 11:03:38.908663 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-metrics/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.098791 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-metrics/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.104050 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-reloader/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.118507 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/controller/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.118673 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/cp-frr-files/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.280310 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/kube-rbac-proxy/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.288376 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/kube-rbac-proxy-frr/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.295325 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/frr-metrics/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.506474 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/reloader/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.527642 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-t6f4w_d5fccf55-8452-4691-9d4b-d27b6c9e0a2f/frr-k8s-webhook-server/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.705835 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-66985c5f8b-b6zh4_e1506c16-7214-4b74-a6d5-935646d2bb83/manager/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.821621 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56bbcd747-q8n7d_cc5f6c3c-71a1-443c-9c3a-67fc2305dd62/webhook-server/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.922868 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtww8_4bdee341-d432-4260-8334-4c47aee1593a/frr/0.log"
Dec 01 11:03:39 crc kubenswrapper[4761]: I1201 11:03:39.929732 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hqff8_d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c/kube-rbac-proxy/0.log"
Dec 01 11:03:40 crc kubenswrapper[4761]: E1201 11:03:40.004013 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Dec 01 11:03:40 crc kubenswrapper[4761]: E1201 11:03:40.004094 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:05:42.004078383 +0000 UTC m=+2081.307837017 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found
Dec 01 11:03:40 crc kubenswrapper[4761]: E1201 11:03:40.004581 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Dec 01 11:03:40 crc kubenswrapper[4761]: E1201 11:03:40.004621 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:05:42.004611018 +0000 UTC m=+2081.308369662 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found
Dec 01 11:03:40 crc kubenswrapper[4761]: I1201 11:03:40.128199 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8"
Dec 01 11:03:40 crc kubenswrapper[4761]: E1201 11:03:40.128353 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c"
Dec 01 11:03:40 crc kubenswrapper[4761]: I1201 11:03:40.162801 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hqff8_d0b5ce7c-d83f-4d19-9c53-b02f6c73c39c/speaker/0.log"
Dec 01 11:03:54 crc kubenswrapper[4761]: I1201 11:03:54.272431 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3/openstackclient/0.log"
Dec 01 11:03:55 crc kubenswrapper[4761]: I1201 11:03:55.128723 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8"
Dec 01 11:03:55 crc kubenswrapper[4761]: E1201 11:03:55.128980 4761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qjx5r_openshift-machine-config-operator(eaf56ffe-a6c0-446a-81db-deae9bd72c7c)\"" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c"
Dec 01 11:04:07 crc kubenswrapper[4761]: I1201 11:04:07.392724 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/util/0.log"
Dec 01 11:04:07 crc kubenswrapper[4761]: I1201 11:04:07.573261 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/util/0.log"
Dec 01 11:04:07 crc kubenswrapper[4761]: I1201 11:04:07.588955 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/pull/0.log"
Dec 01 11:04:07 crc kubenswrapper[4761]: I1201 11:04:07.593194 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/pull/0.log"
Dec 01 11:04:07 crc kubenswrapper[4761]: I1201 11:04:07.771186 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/extract/0.log"
Dec 01 11:04:07 crc kubenswrapper[4761]: I1201 11:04:07.779118 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/pull/0.log"
Dec 01 11:04:07 crc kubenswrapper[4761]: I1201 11:04:07.795629 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83skpbk_9a577166-579f-48b6-92c0-39505fdf48f5/util/0.log"
Dec 01 11:04:07 crc kubenswrapper[4761]: I1201 11:04:07.931237 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-utilities/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.064906 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-utilities/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.094910 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-content/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.095723 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-content/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.230659 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-content/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.243060 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/extract-utilities/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.438312 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-utilities/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.632633 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-utilities/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.658136 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-content/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.677563 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5vk2_4b483973-7f6c-4581-b676-d19f25446c7a/registry-server/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.677883 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-content/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.795082 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-utilities/0.log"
Dec 01 11:04:08 crc kubenswrapper[4761]: I1201 11:04:08.805311 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/extract-content/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.040464 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lgbgw_04df1b9e-01cf-41e0-af31-dcb2e0512d45/marketplace-operator/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.114238 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-utilities/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.220803 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-utilities/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.296406 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-content/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.323976 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p6sdd_d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/registry-server/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.325778 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-content/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.432625 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-content/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.453874 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/extract-utilities/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.582199 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8jdwr_687031d5-0ddd-4dee-a39a-9b0a3a32bf69/registry-server/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.624256 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-utilities/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.784968 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-utilities/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.791416 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-content/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.801410 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-content/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.992957 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-content/0.log"
Dec 01 11:04:09 crc kubenswrapper[4761]: I1201 11:04:09.995561 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/extract-utilities/0.log"
Dec 01 11:04:10 crc kubenswrapper[4761]: I1201 11:04:10.128009 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8"
Dec 01 11:04:10 crc kubenswrapper[4761]: I1201 11:04:10.144364 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jf2tz_3393fe92-7c80-4229-bc37-a12db29df394/registry-server/0.log"
Dec 01 11:04:10 crc kubenswrapper[4761]: I1201 11:04:10.945706
4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"f00997390931adedaeec0ca2a4b2fb2cd14223bf986f1cc6a5dd25b14eaf01e2"} Dec 01 11:05:16 crc kubenswrapper[4761]: I1201 11:05:16.456270 4761 generic.go:334] "Generic (PLEG): container finished" podID="639cd093-0007-45df-b8f9-e2c36cb54554" containerID="9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03" exitCode=0 Dec 01 11:05:16 crc kubenswrapper[4761]: I1201 11:05:16.456626 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mbns/must-gather-mxjh8" event={"ID":"639cd093-0007-45df-b8f9-e2c36cb54554","Type":"ContainerDied","Data":"9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03"} Dec 01 11:05:16 crc kubenswrapper[4761]: I1201 11:05:16.457574 4761 scope.go:117] "RemoveContainer" containerID="9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03" Dec 01 11:05:16 crc kubenswrapper[4761]: I1201 11:05:16.830467 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9mbns_must-gather-mxjh8_639cd093-0007-45df-b8f9-e2c36cb54554/gather/0.log" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.405755 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bn9zb"] Dec 01 11:05:24 crc kubenswrapper[4761]: E1201 11:05:24.406496 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" containerName="extract-utilities" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.406516 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" containerName="extract-utilities" Dec 01 11:05:24 crc kubenswrapper[4761]: E1201 11:05:24.406531 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" 
containerName="registry-server" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.406539 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" containerName="registry-server" Dec 01 11:05:24 crc kubenswrapper[4761]: E1201 11:05:24.406588 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" containerName="extract-content" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.406598 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" containerName="extract-content" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.406726 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c4705b-568d-4623-aa35-84b44bb24939" containerName="registry-server" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.407454 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.424090 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn9zb"] Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.452141 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-utilities\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.452234 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z95g5\" (UniqueName: \"kubernetes.io/projected/89b45f51-44d5-475f-a58f-05e023317582-kube-api-access-z95g5\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " 
pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.452304 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-catalog-content\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.553619 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-utilities\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.553667 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z95g5\" (UniqueName: \"kubernetes.io/projected/89b45f51-44d5-475f-a58f-05e023317582-kube-api-access-z95g5\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.553702 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-catalog-content\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.554240 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-catalog-content\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " 
pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.554370 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-utilities\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.582483 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z95g5\" (UniqueName: \"kubernetes.io/projected/89b45f51-44d5-475f-a58f-05e023317582-kube-api-access-z95g5\") pod \"redhat-operators-bn9zb\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:24 crc kubenswrapper[4761]: I1201 11:05:24.728888 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.148419 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn9zb"] Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.245701 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9mbns/must-gather-mxjh8"] Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.246350 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9mbns/must-gather-mxjh8" podUID="639cd093-0007-45df-b8f9-e2c36cb54554" containerName="copy" containerID="cri-o://978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a" gracePeriod=2 Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.250215 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9mbns/must-gather-mxjh8"] Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.537836 4761 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-9mbns_must-gather-mxjh8_639cd093-0007-45df-b8f9-e2c36cb54554/copy/0.log" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.538400 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.540938 4761 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9mbns_must-gather-mxjh8_639cd093-0007-45df-b8f9-e2c36cb54554/copy/0.log" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.541348 4761 generic.go:334] "Generic (PLEG): container finished" podID="639cd093-0007-45df-b8f9-e2c36cb54554" containerID="978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a" exitCode=143 Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.541465 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9mbns/must-gather-mxjh8" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.541479 4761 scope.go:117] "RemoveContainer" containerID="978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.543348 4761 generic.go:334] "Generic (PLEG): container finished" podID="89b45f51-44d5-475f-a58f-05e023317582" containerID="8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61" exitCode=0 Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.543386 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn9zb" event={"ID":"89b45f51-44d5-475f-a58f-05e023317582","Type":"ContainerDied","Data":"8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61"} Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.543411 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn9zb" 
event={"ID":"89b45f51-44d5-475f-a58f-05e023317582","Type":"ContainerStarted","Data":"a5f6700ff00104f8aa4c53f55be2ddeec88c1dd15d3cff457ecca736710a87fe"} Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.565977 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh7c2\" (UniqueName: \"kubernetes.io/projected/639cd093-0007-45df-b8f9-e2c36cb54554-kube-api-access-rh7c2\") pod \"639cd093-0007-45df-b8f9-e2c36cb54554\" (UID: \"639cd093-0007-45df-b8f9-e2c36cb54554\") " Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.566325 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/639cd093-0007-45df-b8f9-e2c36cb54554-must-gather-output\") pod \"639cd093-0007-45df-b8f9-e2c36cb54554\" (UID: \"639cd093-0007-45df-b8f9-e2c36cb54554\") " Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.592370 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/639cd093-0007-45df-b8f9-e2c36cb54554-kube-api-access-rh7c2" (OuterVolumeSpecName: "kube-api-access-rh7c2") pod "639cd093-0007-45df-b8f9-e2c36cb54554" (UID: "639cd093-0007-45df-b8f9-e2c36cb54554"). InnerVolumeSpecName "kube-api-access-rh7c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.592713 4761 scope.go:117] "RemoveContainer" containerID="9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.654855 4761 scope.go:117] "RemoveContainer" containerID="978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a" Dec 01 11:05:25 crc kubenswrapper[4761]: E1201 11:05:25.655606 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a\": container with ID starting with 978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a not found: ID does not exist" containerID="978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.655646 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a"} err="failed to get container status \"978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a\": rpc error: code = NotFound desc = could not find container \"978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a\": container with ID starting with 978ef6a6bdc3f08d6eebba46fe06e9a8982db2fcd07415c178ca3935e0a7313a not found: ID does not exist" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.655671 4761 scope.go:117] "RemoveContainer" containerID="9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03" Dec 01 11:05:25 crc kubenswrapper[4761]: E1201 11:05:25.656007 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03\": container with ID starting with 
9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03 not found: ID does not exist" containerID="9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.656033 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03"} err="failed to get container status \"9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03\": rpc error: code = NotFound desc = could not find container \"9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03\": container with ID starting with 9c2b4a7e4c9b56699678f2ecbac0d035f068166f1da0bf38e5985ecbaa4aff03 not found: ID does not exist" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.663242 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/639cd093-0007-45df-b8f9-e2c36cb54554-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "639cd093-0007-45df-b8f9-e2c36cb54554" (UID: "639cd093-0007-45df-b8f9-e2c36cb54554"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.668092 4761 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/639cd093-0007-45df-b8f9-e2c36cb54554-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:25 crc kubenswrapper[4761]: I1201 11:05:25.668133 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh7c2\" (UniqueName: \"kubernetes.io/projected/639cd093-0007-45df-b8f9-e2c36cb54554-kube-api-access-rh7c2\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:27 crc kubenswrapper[4761]: I1201 11:05:27.134460 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="639cd093-0007-45df-b8f9-e2c36cb54554" path="/var/lib/kubelet/pods/639cd093-0007-45df-b8f9-e2c36cb54554/volumes" Dec 01 11:05:27 crc kubenswrapper[4761]: I1201 11:05:27.560196 4761 generic.go:334] "Generic (PLEG): container finished" podID="89b45f51-44d5-475f-a58f-05e023317582" containerID="7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901" exitCode=0 Dec 01 11:05:27 crc kubenswrapper[4761]: I1201 11:05:27.560234 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn9zb" event={"ID":"89b45f51-44d5-475f-a58f-05e023317582","Type":"ContainerDied","Data":"7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901"} Dec 01 11:05:28 crc kubenswrapper[4761]: I1201 11:05:28.567821 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn9zb" event={"ID":"89b45f51-44d5-475f-a58f-05e023317582","Type":"ContainerStarted","Data":"56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661"} Dec 01 11:05:28 crc kubenswrapper[4761]: I1201 11:05:28.587356 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bn9zb" podStartSLOduration=1.9510370780000001 
podStartE2EDuration="4.587333646s" podCreationTimestamp="2025-12-01 11:05:24 +0000 UTC" firstStartedPulling="2025-12-01 11:05:25.544955791 +0000 UTC m=+2064.848714415" lastFinishedPulling="2025-12-01 11:05:28.181252359 +0000 UTC m=+2067.485010983" observedRunningTime="2025-12-01 11:05:28.585004163 +0000 UTC m=+2067.888762787" watchObservedRunningTime="2025-12-01 11:05:28.587333646 +0000 UTC m=+2067.891092270" Dec 01 11:05:34 crc kubenswrapper[4761]: I1201 11:05:34.730076 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:34 crc kubenswrapper[4761]: I1201 11:05:34.730902 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:34 crc kubenswrapper[4761]: I1201 11:05:34.800827 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:35 crc kubenswrapper[4761]: I1201 11:05:35.658245 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:35 crc kubenswrapper[4761]: I1201 11:05:35.704858 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bn9zb"] Dec 01 11:05:37 crc kubenswrapper[4761]: I1201 11:05:37.623266 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bn9zb" podUID="89b45f51-44d5-475f-a58f-05e023317582" containerName="registry-server" containerID="cri-o://56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661" gracePeriod=2 Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.477463 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.559005 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z95g5\" (UniqueName: \"kubernetes.io/projected/89b45f51-44d5-475f-a58f-05e023317582-kube-api-access-z95g5\") pod \"89b45f51-44d5-475f-a58f-05e023317582\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.559312 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-utilities\") pod \"89b45f51-44d5-475f-a58f-05e023317582\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.559435 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-catalog-content\") pod \"89b45f51-44d5-475f-a58f-05e023317582\" (UID: \"89b45f51-44d5-475f-a58f-05e023317582\") " Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.560241 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-utilities" (OuterVolumeSpecName: "utilities") pod "89b45f51-44d5-475f-a58f-05e023317582" (UID: "89b45f51-44d5-475f-a58f-05e023317582"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.572373 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b45f51-44d5-475f-a58f-05e023317582-kube-api-access-z95g5" (OuterVolumeSpecName: "kube-api-access-z95g5") pod "89b45f51-44d5-475f-a58f-05e023317582" (UID: "89b45f51-44d5-475f-a58f-05e023317582"). InnerVolumeSpecName "kube-api-access-z95g5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.648999 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn9zb" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.649071 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn9zb" event={"ID":"89b45f51-44d5-475f-a58f-05e023317582","Type":"ContainerDied","Data":"56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661"} Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.649134 4761 scope.go:117] "RemoveContainer" containerID="56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.648856 4761 generic.go:334] "Generic (PLEG): container finished" podID="89b45f51-44d5-475f-a58f-05e023317582" containerID="56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661" exitCode=0 Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.649667 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn9zb" event={"ID":"89b45f51-44d5-475f-a58f-05e023317582","Type":"ContainerDied","Data":"a5f6700ff00104f8aa4c53f55be2ddeec88c1dd15d3cff457ecca736710a87fe"} Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.664480 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z95g5\" (UniqueName: \"kubernetes.io/projected/89b45f51-44d5-475f-a58f-05e023317582-kube-api-access-z95g5\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.664521 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.682286 4761 scope.go:117] "RemoveContainer" 
containerID="7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.697319 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89b45f51-44d5-475f-a58f-05e023317582" (UID: "89b45f51-44d5-475f-a58f-05e023317582"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.706797 4761 scope.go:117] "RemoveContainer" containerID="8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.741356 4761 scope.go:117] "RemoveContainer" containerID="56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661" Dec 01 11:05:39 crc kubenswrapper[4761]: E1201 11:05:39.741985 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661\": container with ID starting with 56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661 not found: ID does not exist" containerID="56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.742046 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661"} err="failed to get container status \"56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661\": rpc error: code = NotFound desc = could not find container \"56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661\": container with ID starting with 56762ec1fedfbd08ff7f65f6ac049796427fd51175961c6801f58f83cf4e3661 not found: ID does not exist" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 
11:05:39.742086 4761 scope.go:117] "RemoveContainer" containerID="7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901" Dec 01 11:05:39 crc kubenswrapper[4761]: E1201 11:05:39.742575 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901\": container with ID starting with 7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901 not found: ID does not exist" containerID="7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.742608 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901"} err="failed to get container status \"7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901\": rpc error: code = NotFound desc = could not find container \"7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901\": container with ID starting with 7346a128c8e426f22f7707fd1c7b9753c2bab4d9a11a8dec4d10065fba233901 not found: ID does not exist" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.742628 4761 scope.go:117] "RemoveContainer" containerID="8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61" Dec 01 11:05:39 crc kubenswrapper[4761]: E1201 11:05:39.742925 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61\": container with ID starting with 8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61 not found: ID does not exist" containerID="8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.742963 4761 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61"} err="failed to get container status \"8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61\": rpc error: code = NotFound desc = could not find container \"8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61\": container with ID starting with 8a8d89647af16e93caa6607c9f46808c28a86fc1be8b5f149c9c1a16c3205d61 not found: ID does not exist" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.765962 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b45f51-44d5-475f-a58f-05e023317582-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:39 crc kubenswrapper[4761]: I1201 11:05:39.997356 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bn9zb"] Dec 01 11:05:40 crc kubenswrapper[4761]: I1201 11:05:40.002034 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bn9zb"] Dec 01 11:05:41 crc kubenswrapper[4761]: I1201 11:05:41.140953 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b45f51-44d5-475f-a58f-05e023317582" path="/var/lib/kubelet/pods/89b45f51-44d5-475f-a58f-05e023317582/volumes" Dec 01 11:05:42 crc kubenswrapper[4761]: E1201 11:05:42.093666 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 11:05:42 crc kubenswrapper[4761]: E1201 11:05:42.094046 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:07:44.094028296 +0000 UTC m=+2203.397786930 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 11:05:42 crc kubenswrapper[4761]: E1201 11:05:42.094491 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 11:05:42 crc kubenswrapper[4761]: E1201 11:05:42.094540 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:07:44.094527769 +0000 UTC m=+2203.398286403 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 11:06:33 crc kubenswrapper[4761]: I1201 11:06:33.850737 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:06:33 crc kubenswrapper[4761]: I1201 11:06:33.851332 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:07:03 crc kubenswrapper[4761]: I1201 11:07:03.850778 4761 patch_prober.go:28] interesting 
pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:07:03 crc kubenswrapper[4761]: I1201 11:07:03.851385 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:07:33 crc kubenswrapper[4761]: I1201 11:07:33.850476 4761 patch_prober.go:28] interesting pod/machine-config-daemon-qjx5r container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:07:33 crc kubenswrapper[4761]: I1201 11:07:33.851272 4761 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:07:33 crc kubenswrapper[4761]: I1201 11:07:33.851359 4761 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" Dec 01 11:07:33 crc kubenswrapper[4761]: I1201 11:07:33.852358 4761 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f00997390931adedaeec0ca2a4b2fb2cd14223bf986f1cc6a5dd25b14eaf01e2"} pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Dec 01 11:07:33 crc kubenswrapper[4761]: I1201 11:07:33.852458 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" podUID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerName="machine-config-daemon" containerID="cri-o://f00997390931adedaeec0ca2a4b2fb2cd14223bf986f1cc6a5dd25b14eaf01e2" gracePeriod=600 Dec 01 11:07:34 crc kubenswrapper[4761]: I1201 11:07:34.875248 4761 generic.go:334] "Generic (PLEG): container finished" podID="eaf56ffe-a6c0-446a-81db-deae9bd72c7c" containerID="f00997390931adedaeec0ca2a4b2fb2cd14223bf986f1cc6a5dd25b14eaf01e2" exitCode=0 Dec 01 11:07:34 crc kubenswrapper[4761]: I1201 11:07:34.875352 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerDied","Data":"f00997390931adedaeec0ca2a4b2fb2cd14223bf986f1cc6a5dd25b14eaf01e2"} Dec 01 11:07:34 crc kubenswrapper[4761]: I1201 11:07:34.875864 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qjx5r" event={"ID":"eaf56ffe-a6c0-446a-81db-deae9bd72c7c","Type":"ContainerStarted","Data":"4b8544f64a715ed2437d37e3472240b1c86af4dabe6d7c34319ebe0d95fcb3f6"} Dec 01 11:07:34 crc kubenswrapper[4761]: I1201 11:07:34.875917 4761 scope.go:117] "RemoveContainer" containerID="684448e2a3e9ac1f7ae3a26a311269b4dd491871939b4962d37a7f2c78d1ebc8" Dec 01 11:07:44 crc kubenswrapper[4761]: E1201 11:07:44.105858 4761 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Dec 01 11:07:44 crc kubenswrapper[4761]: E1201 11:07:44.105914 4761 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Dec 01 11:07:44 crc kubenswrapper[4761]: E1201 11:07:44.106473 4761 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:09:46.106450859 +0000 UTC m=+2325.410209513 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config-secret") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : secret "openstack-config-secret" not found Dec 01 11:07:44 crc kubenswrapper[4761]: E1201 11:07:44.106594 4761 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config podName:3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3 nodeName:}" failed. No retries permitted until 2025-12-01 11:09:46.106523801 +0000 UTC m=+2325.410282465 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3-openstack-config") pod "openstackclient" (UID: "3ad227ac-66b6-4a9d-b5a8-adbf86fb8ba3") : configmap "openstack-config" not found Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.715392 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dbvzj"] Dec 01 11:08:19 crc kubenswrapper[4761]: E1201 11:08:19.716829 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b45f51-44d5-475f-a58f-05e023317582" containerName="registry-server" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.716867 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b45f51-44d5-475f-a58f-05e023317582" containerName="registry-server" Dec 01 11:08:19 crc kubenswrapper[4761]: E1201 11:08:19.716893 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b45f51-44d5-475f-a58f-05e023317582" 
containerName="extract-content" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.716912 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b45f51-44d5-475f-a58f-05e023317582" containerName="extract-content" Dec 01 11:08:19 crc kubenswrapper[4761]: E1201 11:08:19.716949 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639cd093-0007-45df-b8f9-e2c36cb54554" containerName="gather" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.716962 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="639cd093-0007-45df-b8f9-e2c36cb54554" containerName="gather" Dec 01 11:08:19 crc kubenswrapper[4761]: E1201 11:08:19.716993 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639cd093-0007-45df-b8f9-e2c36cb54554" containerName="copy" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.717008 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="639cd093-0007-45df-b8f9-e2c36cb54554" containerName="copy" Dec 01 11:08:19 crc kubenswrapper[4761]: E1201 11:08:19.717032 4761 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b45f51-44d5-475f-a58f-05e023317582" containerName="extract-utilities" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.722954 4761 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b45f51-44d5-475f-a58f-05e023317582" containerName="extract-utilities" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.723332 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="639cd093-0007-45df-b8f9-e2c36cb54554" containerName="gather" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.723382 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b45f51-44d5-475f-a58f-05e023317582" containerName="registry-server" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.723420 4761 memory_manager.go:354] "RemoveStaleState removing state" podUID="639cd093-0007-45df-b8f9-e2c36cb54554" containerName="copy" Dec 01 11:08:19 crc 
kubenswrapper[4761]: I1201 11:08:19.725325 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.728721 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbvzj"] Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.805784 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbdff\" (UniqueName: \"kubernetes.io/projected/b78cebe4-f405-4061-a163-33db877f5be6-kube-api-access-xbdff\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.805835 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-catalog-content\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.805954 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-utilities\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.907252 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-utilities\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc 
kubenswrapper[4761]: I1201 11:08:19.907340 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbdff\" (UniqueName: \"kubernetes.io/projected/b78cebe4-f405-4061-a163-33db877f5be6-kube-api-access-xbdff\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.907389 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-catalog-content\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.907796 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-utilities\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.907894 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-catalog-content\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:19 crc kubenswrapper[4761]: I1201 11:08:19.938250 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbdff\" (UniqueName: \"kubernetes.io/projected/b78cebe4-f405-4061-a163-33db877f5be6-kube-api-access-xbdff\") pod \"redhat-marketplace-dbvzj\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:20 crc kubenswrapper[4761]: I1201 
11:08:20.080162 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:20 crc kubenswrapper[4761]: I1201 11:08:20.597722 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbvzj"] Dec 01 11:08:21 crc kubenswrapper[4761]: I1201 11:08:21.218412 4761 generic.go:334] "Generic (PLEG): container finished" podID="b78cebe4-f405-4061-a163-33db877f5be6" containerID="66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d" exitCode=0 Dec 01 11:08:21 crc kubenswrapper[4761]: I1201 11:08:21.218675 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbvzj" event={"ID":"b78cebe4-f405-4061-a163-33db877f5be6","Type":"ContainerDied","Data":"66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d"} Dec 01 11:08:21 crc kubenswrapper[4761]: I1201 11:08:21.218837 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbvzj" event={"ID":"b78cebe4-f405-4061-a163-33db877f5be6","Type":"ContainerStarted","Data":"3183e0a09a939ad511473e6d37e78c76cdc31ca084a54f966c4add5936cfa6d8"} Dec 01 11:08:22 crc kubenswrapper[4761]: I1201 11:08:22.227431 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbvzj" event={"ID":"b78cebe4-f405-4061-a163-33db877f5be6","Type":"ContainerStarted","Data":"63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82"} Dec 01 11:08:23 crc kubenswrapper[4761]: I1201 11:08:23.238864 4761 generic.go:334] "Generic (PLEG): container finished" podID="b78cebe4-f405-4061-a163-33db877f5be6" containerID="63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82" exitCode=0 Dec 01 11:08:23 crc kubenswrapper[4761]: I1201 11:08:23.238954 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbvzj" 
event={"ID":"b78cebe4-f405-4061-a163-33db877f5be6","Type":"ContainerDied","Data":"63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82"} Dec 01 11:08:23 crc kubenswrapper[4761]: I1201 11:08:23.241815 4761 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:08:23 crc kubenswrapper[4761]: I1201 11:08:23.916954 4761 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86bh8"] Dec 01 11:08:23 crc kubenswrapper[4761]: I1201 11:08:23.928968 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:23 crc kubenswrapper[4761]: I1201 11:08:23.973398 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86bh8"] Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.074300 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b92b21-1fe3-4ad5-98fd-50e5f8906897-catalog-content\") pod \"community-operators-86bh8\" (UID: \"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") " pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.074418 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b92b21-1fe3-4ad5-98fd-50e5f8906897-utilities\") pod \"community-operators-86bh8\" (UID: \"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") " pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.074461 4761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvvzx\" (UniqueName: \"kubernetes.io/projected/38b92b21-1fe3-4ad5-98fd-50e5f8906897-kube-api-access-xvvzx\") pod \"community-operators-86bh8\" (UID: 
\"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") " pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.175622 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvvzx\" (UniqueName: \"kubernetes.io/projected/38b92b21-1fe3-4ad5-98fd-50e5f8906897-kube-api-access-xvvzx\") pod \"community-operators-86bh8\" (UID: \"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") " pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.175769 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b92b21-1fe3-4ad5-98fd-50e5f8906897-catalog-content\") pod \"community-operators-86bh8\" (UID: \"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") " pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.175821 4761 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b92b21-1fe3-4ad5-98fd-50e5f8906897-utilities\") pod \"community-operators-86bh8\" (UID: \"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") " pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.176426 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b92b21-1fe3-4ad5-98fd-50e5f8906897-utilities\") pod \"community-operators-86bh8\" (UID: \"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") " pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.177051 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b92b21-1fe3-4ad5-98fd-50e5f8906897-catalog-content\") pod \"community-operators-86bh8\" (UID: \"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") 
" pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.208413 4761 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvvzx\" (UniqueName: \"kubernetes.io/projected/38b92b21-1fe3-4ad5-98fd-50e5f8906897-kube-api-access-xvvzx\") pod \"community-operators-86bh8\" (UID: \"38b92b21-1fe3-4ad5-98fd-50e5f8906897\") " pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.250400 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbvzj" event={"ID":"b78cebe4-f405-4061-a163-33db877f5be6","Type":"ContainerStarted","Data":"18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b"} Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.262172 4761 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86bh8" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.272117 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dbvzj" podStartSLOduration=2.781047505 podStartE2EDuration="5.272099747s" podCreationTimestamp="2025-12-01 11:08:19 +0000 UTC" firstStartedPulling="2025-12-01 11:08:21.220799086 +0000 UTC m=+2240.524557760" lastFinishedPulling="2025-12-01 11:08:23.711851348 +0000 UTC m=+2243.015610002" observedRunningTime="2025-12-01 11:08:24.270793414 +0000 UTC m=+2243.574552038" watchObservedRunningTime="2025-12-01 11:08:24.272099747 +0000 UTC m=+2243.575858381" Dec 01 11:08:24 crc kubenswrapper[4761]: I1201 11:08:24.600254 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86bh8"] Dec 01 11:08:25 crc kubenswrapper[4761]: I1201 11:08:25.270023 4761 generic.go:334] "Generic (PLEG): container finished" podID="38b92b21-1fe3-4ad5-98fd-50e5f8906897" 
containerID="c4cb3fa7cfef5fd137f598de41b83ee6b6602daec66958baff4868a8ddbc7155" exitCode=0 Dec 01 11:08:25 crc kubenswrapper[4761]: I1201 11:08:25.270206 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bh8" event={"ID":"38b92b21-1fe3-4ad5-98fd-50e5f8906897","Type":"ContainerDied","Data":"c4cb3fa7cfef5fd137f598de41b83ee6b6602daec66958baff4868a8ddbc7155"} Dec 01 11:08:25 crc kubenswrapper[4761]: I1201 11:08:25.271390 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bh8" event={"ID":"38b92b21-1fe3-4ad5-98fd-50e5f8906897","Type":"ContainerStarted","Data":"2b76fce6ca2c12050f40c7fab640d0b832ca16971931bb90fb31641e3e071114"} Dec 01 11:08:30 crc kubenswrapper[4761]: I1201 11:08:30.081305 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:30 crc kubenswrapper[4761]: I1201 11:08:30.081898 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:30 crc kubenswrapper[4761]: I1201 11:08:30.135713 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:30 crc kubenswrapper[4761]: I1201 11:08:30.325092 4761 generic.go:334] "Generic (PLEG): container finished" podID="38b92b21-1fe3-4ad5-98fd-50e5f8906897" containerID="d2d059d675b78adb48f154b49fd4bc2294ba358baf4cc83669036ef00a182eaa" exitCode=0 Dec 01 11:08:30 crc kubenswrapper[4761]: I1201 11:08:30.325173 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bh8" event={"ID":"38b92b21-1fe3-4ad5-98fd-50e5f8906897","Type":"ContainerDied","Data":"d2d059d675b78adb48f154b49fd4bc2294ba358baf4cc83669036ef00a182eaa"} Dec 01 11:08:30 crc kubenswrapper[4761]: I1201 11:08:30.387124 4761 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:31 crc kubenswrapper[4761]: I1201 11:08:31.374173 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbvzj"] Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.342837 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86bh8" event={"ID":"38b92b21-1fe3-4ad5-98fd-50e5f8906897","Type":"ContainerStarted","Data":"8d4b77bb4ff1578d59667b4d746604b2f99bb358c88497e5f90bd30458279be7"} Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.343045 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dbvzj" podUID="b78cebe4-f405-4061-a163-33db877f5be6" containerName="registry-server" containerID="cri-o://18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b" gracePeriod=2 Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.405155 4761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86bh8" podStartSLOduration=3.221955156 podStartE2EDuration="9.405126891s" podCreationTimestamp="2025-12-01 11:08:23 +0000 UTC" firstStartedPulling="2025-12-01 11:08:25.272338035 +0000 UTC m=+2244.576096679" lastFinishedPulling="2025-12-01 11:08:31.45550975 +0000 UTC m=+2250.759268414" observedRunningTime="2025-12-01 11:08:32.398188933 +0000 UTC m=+2251.701947567" watchObservedRunningTime="2025-12-01 11:08:32.405126891 +0000 UTC m=+2251.708885545" Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.762180 4761 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.943498 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-catalog-content\") pod \"b78cebe4-f405-4061-a163-33db877f5be6\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.943679 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-utilities\") pod \"b78cebe4-f405-4061-a163-33db877f5be6\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.943719 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbdff\" (UniqueName: \"kubernetes.io/projected/b78cebe4-f405-4061-a163-33db877f5be6-kube-api-access-xbdff\") pod \"b78cebe4-f405-4061-a163-33db877f5be6\" (UID: \"b78cebe4-f405-4061-a163-33db877f5be6\") " Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.944803 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-utilities" (OuterVolumeSpecName: "utilities") pod "b78cebe4-f405-4061-a163-33db877f5be6" (UID: "b78cebe4-f405-4061-a163-33db877f5be6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.950222 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78cebe4-f405-4061-a163-33db877f5be6-kube-api-access-xbdff" (OuterVolumeSpecName: "kube-api-access-xbdff") pod "b78cebe4-f405-4061-a163-33db877f5be6" (UID: "b78cebe4-f405-4061-a163-33db877f5be6"). InnerVolumeSpecName "kube-api-access-xbdff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:08:32 crc kubenswrapper[4761]: I1201 11:08:32.978917 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b78cebe4-f405-4061-a163-33db877f5be6" (UID: "b78cebe4-f405-4061-a163-33db877f5be6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.045842 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.045883 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78cebe4-f405-4061-a163-33db877f5be6-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.045898 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbdff\" (UniqueName: \"kubernetes.io/projected/b78cebe4-f405-4061-a163-33db877f5be6-kube-api-access-xbdff\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.352756 4761 generic.go:334] "Generic (PLEG): container finished" podID="b78cebe4-f405-4061-a163-33db877f5be6" containerID="18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b" exitCode=0 Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.352845 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbvzj" event={"ID":"b78cebe4-f405-4061-a163-33db877f5be6","Type":"ContainerDied","Data":"18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b"} Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.352915 4761 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbvzj" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.352924 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbvzj" event={"ID":"b78cebe4-f405-4061-a163-33db877f5be6","Type":"ContainerDied","Data":"3183e0a09a939ad511473e6d37e78c76cdc31ca084a54f966c4add5936cfa6d8"} Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.352943 4761 scope.go:117] "RemoveContainer" containerID="18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.373682 4761 scope.go:117] "RemoveContainer" containerID="63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.381628 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbvzj"] Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.388246 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbvzj"] Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.393612 4761 scope.go:117] "RemoveContainer" containerID="66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.428996 4761 scope.go:117] "RemoveContainer" containerID="18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b" Dec 01 11:08:33 crc kubenswrapper[4761]: E1201 11:08:33.429480 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b\": container with ID starting with 18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b not found: ID does not exist" containerID="18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.429519 4761 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b"} err="failed to get container status \"18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b\": rpc error: code = NotFound desc = could not find container \"18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b\": container with ID starting with 18c82e2097ba21c338621fb2bda12aaebcd31caad1c1eb1bf5faa4197c0cae0b not found: ID does not exist" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.429559 4761 scope.go:117] "RemoveContainer" containerID="63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82" Dec 01 11:08:33 crc kubenswrapper[4761]: E1201 11:08:33.429858 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82\": container with ID starting with 63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82 not found: ID does not exist" containerID="63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.429896 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82"} err="failed to get container status \"63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82\": rpc error: code = NotFound desc = could not find container \"63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82\": container with ID starting with 63a989dd8b5605c3041cfb8afe60045c3b0cb9a0cff0f5d9ed25cd448b8a4b82 not found: ID does not exist" Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.429924 4761 scope.go:117] "RemoveContainer" containerID="66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d" Dec 01 11:08:33 crc kubenswrapper[4761]: E1201 
11:08:33.430461 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d\": container with ID starting with 66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d not found: ID does not exist" containerID="66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d"
Dec 01 11:08:33 crc kubenswrapper[4761]: I1201 11:08:33.430518 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d"} err="failed to get container status \"66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d\": rpc error: code = NotFound desc = could not find container \"66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d\": container with ID starting with 66d82e4770ee2c2a69f1fe1910641fedae46a7dc38e21b192a969a836a050b3d not found: ID does not exist"
Dec 01 11:08:34 crc kubenswrapper[4761]: I1201 11:08:34.262997 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86bh8"
Dec 01 11:08:34 crc kubenswrapper[4761]: I1201 11:08:34.263067 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86bh8"
Dec 01 11:08:34 crc kubenswrapper[4761]: I1201 11:08:34.333109 4761 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86bh8"
Dec 01 11:08:35 crc kubenswrapper[4761]: I1201 11:08:35.136675 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78cebe4-f405-4061-a163-33db877f5be6" path="/var/lib/kubelet/pods/b78cebe4-f405-4061-a163-33db877f5be6/volumes"
Dec 01 11:08:44 crc kubenswrapper[4761]: I1201 11:08:44.324684 4761 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86bh8"
Dec 01 11:08:44 crc kubenswrapper[4761]: I1201 11:08:44.418625 4761 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86bh8"]
Dec 01 11:08:44 crc kubenswrapper[4761]: I1201 11:08:44.462546 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6sdd"]
Dec 01 11:08:44 crc kubenswrapper[4761]: I1201 11:08:44.463028 4761 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p6sdd" podUID="d5c8ad76-1c9b-4463-84cd-9b4501f80f8b" containerName="registry-server" containerID="cri-o://d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b" gracePeriod=2
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.326435 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6sdd"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.450924 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9krgl\" (UniqueName: \"kubernetes.io/projected/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-kube-api-access-9krgl\") pod \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") "
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.450988 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-catalog-content\") pod \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") "
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.451079 4761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-utilities\") pod \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\" (UID: \"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b\") "
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.451732 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-utilities" (OuterVolumeSpecName: "utilities") pod "d5c8ad76-1c9b-4463-84cd-9b4501f80f8b" (UID: "d5c8ad76-1c9b-4463-84cd-9b4501f80f8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.463819 4761 generic.go:334] "Generic (PLEG): container finished" podID="d5c8ad76-1c9b-4463-84cd-9b4501f80f8b" containerID="d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b" exitCode=0
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.463993 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6sdd" event={"ID":"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b","Type":"ContainerDied","Data":"d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b"}
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.464073 4761 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6sdd"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.464088 4761 scope.go:117] "RemoveContainer" containerID="d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.464068 4761 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6sdd" event={"ID":"d5c8ad76-1c9b-4463-84cd-9b4501f80f8b","Type":"ContainerDied","Data":"10ee87e594d46230670ea9b2af7c2c6d8c8615b8c05ba68af1c664d47108abc6"}
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.472412 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-kube-api-access-9krgl" (OuterVolumeSpecName: "kube-api-access-9krgl") pod "d5c8ad76-1c9b-4463-84cd-9b4501f80f8b" (UID: "d5c8ad76-1c9b-4463-84cd-9b4501f80f8b"). InnerVolumeSpecName "kube-api-access-9krgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.487008 4761 scope.go:117] "RemoveContainer" containerID="d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.499375 4761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5c8ad76-1c9b-4463-84cd-9b4501f80f8b" (UID: "d5c8ad76-1c9b-4463-84cd-9b4501f80f8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.503525 4761 scope.go:117] "RemoveContainer" containerID="a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.518501 4761 scope.go:117] "RemoveContainer" containerID="d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b"
Dec 01 11:08:45 crc kubenswrapper[4761]: E1201 11:08:45.518984 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b\": container with ID starting with d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b not found: ID does not exist" containerID="d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.519042 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b"} err="failed to get container status \"d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b\": rpc error: code = NotFound desc = could not find container \"d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b\": container with ID starting with d8119f1fd81a48df65a5343f73df9346159c756ceb5bf5104251bb0b79b5997b not found: ID does not exist"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.519076 4761 scope.go:117] "RemoveContainer" containerID="d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b"
Dec 01 11:08:45 crc kubenswrapper[4761]: E1201 11:08:45.519381 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b\": container with ID starting with d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b not found: ID does not exist" containerID="d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.519413 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b"} err="failed to get container status \"d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b\": rpc error: code = NotFound desc = could not find container \"d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b\": container with ID starting with d4d50ec0bddf30e16e1808de52085f502ac2abcdae0a03b541460e3c10effd4b not found: ID does not exist"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.519444 4761 scope.go:117] "RemoveContainer" containerID="a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717"
Dec 01 11:08:45 crc kubenswrapper[4761]: E1201 11:08:45.519696 4761 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717\": container with ID starting with a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717 not found: ID does not exist" containerID="a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.519729 4761 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717"} err="failed to get container status \"a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717\": rpc error: code = NotFound desc = could not find container \"a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717\": container with ID starting with a0870f88eb7a5800ec2faf74d889233696ffe0cbef41d07762df8e414bd92717 not found: ID does not exist"
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.552400 4761 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.552430 4761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9krgl\" (UniqueName: \"kubernetes.io/projected/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-kube-api-access-9krgl\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.552465 4761 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.814501 4761 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6sdd"]
Dec 01 11:08:45 crc kubenswrapper[4761]: I1201 11:08:45.831797 4761 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p6sdd"]
Dec 01 11:08:47 crc kubenswrapper[4761]: I1201 11:08:47.142324 4761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c8ad76-1c9b-4463-84cd-9b4501f80f8b" path="/var/lib/kubelet/pods/d5c8ad76-1c9b-4463-84cd-9b4501f80f8b/volumes"